I know what you’re thinking.
“Why would you ever make something like this? Are you actually insane?”
I’ll let you figure that one out.
Languages: Java with XML
Engine/API/Framework: Android Studio with Android N API
But for now, I really enjoyed making this project for a couple of reasons:
- It was a completely new project: I’d never worked in Java, XML, or Android Studio before. The challenge lay in learning all of that while also making something I found interesting, even if it is sort of a joke.
- Mobile apps are really neat: The mobile platform was very foreign to me, but realizing how simple it is to make something and throw it onto your phone sparked a passion for making more mobile apps in the future.
- I researched what apps are capable of doing for audio: This gave me a general understanding of what I can do with mobile audio programming. It’s pretty restricted depending on your device, but I wasn’t planning on making surround-sound audio for something on a phone anyway.
- Now I can tamper with my phone and make a bunch of awesome apps to replace my lame ones.
This app was pretty easy to set up (after the complications of Gradle and downloading packages until I had everything). The app consists of a RelativeLayout with 8 buttons, text, and button selectors that let each button display different states for pressed/unpressed/idle/etc. Each button has a listener for when it is pushed so it can play the sound mapped to it.
For aligning things properly with little work, a RelativeLayout is perfect. All of the items on your screen are placed in relation to the other objects, which makes it easy to ensure nothing is stealing another object’s screen real estate. Just place your objects within the RelativeLayout’s XML tag. Most of the design work is handled through Android Studio’s helpful UI, so all these attributes may seem daunting at first, but really you’re just changing a few values after selecting an element.
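A trimmed-down sketch of what that looks like in the layout file (the IDs and attribute values here are hypothetical, not the app’s actual ones):

```xml
<!-- Children position themselves relative to the parent or to siblings -->
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextView
        android:id="@+id/title"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerHorizontal="true" />

    <!-- Anchored below the title, so they never overlap -->
    <Button
        android:id="@+id/button_one"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@id/title" />
</RelativeLayout>
```

Attributes like `layout_below` and `layout_centerHorizontal` are what the design UI is setting for you behind the scenes.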
Now for the actual Java backend, which is still pretty simple. All of the audio clips are managed through an AudioManager and a SoundPool.
The AudioManager maintains volume. It can do more for you, but that’s all it does in my app. The SoundPool is the meat of this: SoundPools are containers that track what audio is playing. You can use a MediaPlayer for basic sound, but I needed to play multiple sounds at once. You load all your clips into the pool, save the IDs of the loaded clips, and play them via those IDs. As of Android’s Lollipop update, creating a SoundPool has changed, and the basic constructor is deprecated. The following code checks your device’s API level and builds your SoundPool accordingly; your usage and content type may differ depending on your implementation. Set your SoundPool’s max streams to however many you deem necessary, for that number will be the cap, and when you play more than that amount, the pool will remove your streams based on priority and age.
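A sketch of that version check (the stream count, usage, and content type here are assumptions for a soundboard-style app, not necessarily what this app ships with):

```java
import android.media.AudioAttributes;
import android.media.AudioManager;
import android.media.SoundPool;
import android.os.Build;

public class SoundPoolFactory {
    // Assumed cap: 8 buttons, so at most 8 simultaneous streams.
    private static final int MAX_STREAMS = 8;

    public static SoundPool build() {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            // Lollipop (API 21) and newer: the builder pattern
            // replaces the deprecated constructor.
            AudioAttributes attributes = new AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_MEDIA)
                    .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                    .build();
            return new SoundPool.Builder()
                    .setMaxStreams(MAX_STREAMS)
                    .setAudioAttributes(attributes)
                    .build();
        } else {
            // Older devices: the old constructor still works.
            // Args: max streams, stream type, source quality (unused, 0).
            return new SoundPool(MAX_STREAMS, AudioManager.STREAM_MUSIC, 0);
        }
    }
}
```

Loading and playing then follows the save-the-ID pattern described above, e.g. `int id = pool.load(context, R.raw.someClip, 1);` followed later by `pool.play(id, 1f, 1f, 1, 0, 1f);` (where `R.raw.someClip` is a placeholder resource name).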
The app then just maps a listener function to all the buttons and tracks which button was pressed to play the associated audio clip. Mapping the listener is as easy as writing the function; in the design UI for the XML layout, you look at the button’s attributes and assign the function to onClick.
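Wired up in code, that shared listener might look something like this (the button IDs, sound-ID fields, and method name are hypothetical stand-ins):

```java
import android.view.View;

// One handler shared by every button, assigned via the android:onClick
// attribute in the layout (or setOnClickListener in code).
public void onButtonClick(View view) {
    int soundId;
    // Track which button was pressed by its view ID.
    switch (view.getId()) {
        case R.id.button_one:
            soundId = clipOneId;   // ID saved when the clip was loaded
            break;
        case R.id.button_two:
            soundId = clipTwoId;
            break;
        default:
            return;                // unknown button: play nothing
    }
    // Args: left volume, right volume, priority, loop (0 = none), rate.
    soundPool.play(soundId, 1f, 1f, 1, 0, 1f);
}
```

Because every button routes through one function, adding a ninth sound is just another case in the switch plus one more button in the layout.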
Going forward with this project, I would like to clean up the layout a little more (the snapping in the RelativeLayout was fighting with me) and add multiple activities for a settings menu and an upload menu. I would like to turn this app into a customizable soundboard instead of just a silly toy. I had a lot of fun learning how to make apps, and I hope to create more in the future, and also to explore uploading these kinds of things to the Google Play Store.
Until Next Time!