Dear students,
Last time we met we tried something new. I assigned in-class exercises, you worked on them, and that’s all we had time for. We will discuss and implement your solutions today, which will wrap up our work on the Backlog app.
I’d like to keep getting you involved in writing our apps’ code, but I’m not sure that doing that in class is the best idea. Instead, I will assign the exercises for you to complete on your quarter sheets before next lecture. Then when we meet, we’ll discuss your solutions as we assemble them. The exercises will now appear at the end of these notes.
Following are my solutions to the exercises from last time.
loadDayPhotos, which sets the list’s adapter to hold the photos from all years that have a photo for the current month and day. If the current date is 29 April and the directory is as shown above, then after this method runs, the photo list will have a new adapter holding the photos backlog/2019/04_29.jpg and backlog/2018/04_29.jpg. Keep this short with higher-order functions and lambdas. No loops are necessary. (< 10 lines of Kotlin)

private fun loadDayPhotos() {
  if (photoDirectory.exists()) {
    val photos = photoDirectory
      .listFiles { file -> file.isDirectory }
      .map { Photo(File(it, String.format("%02d_%02d.jpg", month, day))) }
      .filter { it.file.exists() }
    photosList.adapter = PhotoAdapter(photos)
  }
}
dayFile, which accepts a year, a month, and a day, each an Int, and does two things: it returns a File pointing to the photo in the directory structure described above for the specified date, and it creates any missing parent directories that contain the file. (< 5 lines of Kotlin)

private fun dayFile(year: Int, month: Int, day: Int): File {
  val file = File(photoDirectory, String.format("$year/%02d_%02d.jpg", month, day))
  file.parentFile.mkdirs()
  return file
}
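The %02d conversions in these format strings zero-pad the month and day to two digits, which keeps the file names uniform and sorted lexicographically. A quick illustration in plain Kotlin, with arbitrary dates:

```kotlin
fun main() {
  // %02d pads an integer to two digits with leading zeros,
  // so April 29 becomes "04_29.jpg" rather than "4_29.jpg".
  println(String.format("%02d_%02d.jpg", 4, 29))  // 04_29.jpg
  println(String.format("%02d_%02d.jpg", 12, 3))  // 12_03.jpg
}
```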
FileProvider for the app, which we need because we want the camera activity to save our photo to our photo directory. Historically, we just sent a file:///-prefixed URI as an extra to specify the write location. But recent versions of Android outlaw sending raw URIs because they are a security risk. Provide files in the backlog directory in external storage. (< 10 lines of unwrapped XML)

Manifest:

<provider
  android:name="androidx.core.content.FileProvider"
  android:authorities="org.twodee.backlog.fileprovider"
  android:exported="false"
  android:grantUriPermissions="true">
  <meta-data
    android:name="android.support.FILE_PROVIDER_PATHS"
    android:resource="@xml/paths" />
</provider>

xml/paths.xml:

<paths xmlns:android="http://schemas.android.com/apk/res/android">
  <external-path
    name="backlog_images"
    path="backlog" />
</paths>
dayUri, which accepts a year, a month, and a day, each an Int, and does two things: it returns a Uri pointing to the photo in the directory structure described above for the specified date, and it creates any missing parent directories that contain the file. The URI that you create must be shareable with the camera activity. Uri.fromFile does not create shareable URIs. Use a FileProvider to generate the URI. (< 5 lines of Kotlin)

private fun dayUri(year: Int, month: Int, day: Int): Uri {
  val file = dayFile(year, month, day)
  val uri = FileProvider.getUriForFile(this, "org.twodee.backlog.fileprovider", file)
  return uri
}
takePictureFromCamera, which fires off an intent for capturing an image—but only if there is an activity that can handle it. Pass along the URI pointing to the current day’s photo location. Use REQUEST_CAMERA for the request code. (< 10 lines of Kotlin)

private fun takePictureFromCamera() {
  val intent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
  intent.resolveActivity(packageManager)?.let {
    val uri = dayUri(year, month, day)
    intent.putExtra(MediaStore.EXTRA_OUTPUT, uri)
    startActivityForResult(intent, REQUEST_CAMERA)
  }
}
takePictureFromGallery, which fires off an intent for choosing an image from the gallery. Use REQUEST_GALLERY for the request code. (< 10 lines of Kotlin)

private fun takePictureFromGallery() {
  val intent = Intent(Intent.ACTION_GET_CONTENT)
  intent.addCategory(Intent.CATEGORY_OPENABLE)
  intent.type = "image/*"
  startActivityForResult(intent, REQUEST_GALLERY)
}
copyUriToUri, which accepts a source Uri and a destination Uri. It copies the content of the source Uri to the destination Uri. Use ContentResolver to deal with the Uris. Kotlinisms like use and its java.io extension functions can make this short. (< 5 lines of Kotlin)

private fun copyUriToUri(from: Uri, to: Uri) {
  contentResolver.openInputStream(from)?.use { input ->
    contentResolver.openOutputStream(to)?.use { output ->
      input.copyTo(output)
    }
  }
}
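The use/copyTo combination here is ordinary kotlin.io machinery rather than anything Android-specific. The same pattern works over in-memory streams, which makes it easy to try outside an app:

```kotlin
import java.io.ByteArrayInputStream
import java.io.ByteArrayOutputStream

fun main() {
  // use closes each stream when its block exits, even on an exception;
  // copyTo pumps the source into the destination in buffered chunks.
  val source = ByteArrayInputStream("hello".toByteArray())
  val destination = ByteArrayOutputStream()
  source.use { input ->
    destination.use { output ->
      input.copyTo(output)
    }
  }
  println(destination.toString())  // hello
}
```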
onActivityResult. If the camera activity succeeds, reload the current day’s photos. If the gallery activity succeeds, copy the Uri it returns to the current day’s Uri and reload the current day’s photos. Otherwise, defer to the superclass. (< 20 lines of Kotlin, many of which are curly braces)

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
  when (requestCode) {
    REQUEST_CAMERA -> {
      if (resultCode == Activity.RESULT_OK) {
        loadDayPhotos()
      }
    }
    REQUEST_GALLERY -> {
      if (resultCode == Activity.RESULT_OK) {
        data?.data?.let { uri ->
          copyUriToUri(uri, dayUri(year, month, day))
          loadDayPhotos()
        }
      }
    }
    else -> {
      super.onActivityResult(requestCode, resultCode, data)
    }
  }
}
Let’s add one extra feature to our app: speech recognition. We will allow the user to advance the day by speaking “next” or “previous.” This feature will be an artificial appendage for our app, as voice recognition confers no great benefit unless it allows the user to do something faster or something that couldn’t otherwise be done. Neither of those situations applies here.
There are a couple of ways to get voice recognition going. Both require the android.permission.RECORD_AUDIO permission, which is classified as a dangerous permission.
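Because the permission is dangerous, declaring it in the manifest is necessary but not sufficient; the app must also request it from the user at runtime (via requestPermissions) before starting recognition. The manifest declaration looks like this:

<uses-permission android:name="android.permission.RECORD_AUDIO" />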
The first way to fire up speech recognition is to trigger an intent:

val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
  putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
}
startActivityForResult(intent, REQUEST_SPEECH_RECOGNITION)
The list of possible parses comes to us in onActivityResult, which we can process with this clause in our when:
REQUEST_SPEECH_RECOGNITION -> {
  if (resultCode == Activity.RESULT_OK) {
    data?.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)?.forEach {
      if (it == "next") {
        nextDay()
      } else if (it == "previous") {
        previousDay()
      }
    }
  }
}
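One caveat: recognizers sometimes capitalize results or pad them with whitespace, in which case an exact comparison against "next" silently fails. A defensive sketch in plain Kotlin (the helper name is my own):

```kotlin
// Hypothetical helper: normalize an utterance before comparing it to
// our command words, so that "Next" and " next " both match "next".
fun normalizeUtterance(utterance: String): String = utterance.trim().lowercase()

fun main() {
  println(normalizeUtterance(" Next "))    // next
  println(normalizeUtterance("PREVIOUS"))  // previous
}
```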
The trouble with the intent-based approach is that it takes over the UI. Our app doesn’t appear to be in control anymore. The second way to fire up speech recognition is to create a SpeechRecognizer and register a listener:
val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
  putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
  putExtra(RecognizerIntent.EXTRA_CALLING_PACKAGE, packageName)
  putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
}

speechRecognizer = SpeechRecognizer.createSpeechRecognizer(this).apply {
  setRecognitionListener(object : RecognitionListener {
    override fun onReadyForSpeech(p0: Bundle?) {}
    override fun onRmsChanged(p0: Float) {}
    override fun onBufferReceived(p0: ByteArray?) {}
    override fun onPartialResults(p0: Bundle?) {}
    override fun onEvent(p0: Int, p1: Bundle?) {}
    override fun onBeginningOfSpeech() {}
    override fun onEndOfSpeech() {}
    override fun onError(p0: Int) {}
    override fun onResults(bundle: Bundle?) {
      bundle?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)?.forEach {
        if (it == "next") {
          nextDay()
        } else if (it == "previous") {
          previousDay()
        }
      }
      startListening(intent)
    }
  })
  startListening(intent)
}
Because speech recognition is likely to be an intensive operation and send chunks of audio data across the network, it is not meant to run continuously. We defy that principle here by restarting the listener after an utterance has been processed.
The next app we work on is a prank app called Lonely Phone. The premise is simple: when the phone lies flat, it will start to ring—as if someone were calling. As soon as the phone is picked up, the ringing stops. The prankish idea is to install the app on someone else’s phone, place the phone on an incline, and then wait for the owner to lay it flat at some future time. The joke won’t be very surprising if the app’s UI is visible, so we’ll listen to the gravity sensor in a way that doesn’t require the UI.
Our app will be composed of two major actors: a main activity and a service. The main activity starts off with a UI containing a single Switch widget for turning the service on and off. We’ll start with this code, which just loads the UI:
class MainActivity : Activity() {
  private lateinit var cryWhenLonelySwitch: Switch

  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    cryWhenLonelySwitch = findViewById(R.id.cryWhenLonelySwitch)
  }
}
The service that will listen to the gravity sensor without needing the UI will start with this code:
class CryService : Service() {
  private var player: MediaPlayer? = null
  private lateinit var sensorManager: SensorManager
}
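The gravity exercises below hinge on a bit of geometry: the gravity sensor reports a three-component vector whose magnitude is about 9.81 m/s², and when the phone lies flat on its back or its screen, nearly all of that magnitude lands in the z-component (positive on its back, negative on its screen). A sketch of the test in plain Kotlin (the function name and the threshold are my own choices):

```kotlin
import kotlin.math.abs

// Hypothetical helper: a phone is "flat" when gravity points almost
// entirely along the z-axis. Requiring 9.0 of gravity's ~9.81 m/s^2
// magnitude is an arbitrary but reasonable threshold.
fun isFlat(gravityZ: Float): Boolean = abs(gravityZ) > 9.0f

fun main() {
  println(isFlat(9.81f))   // flat on its back
  println(isFlat(-9.78f))  // flat on its screen
  println(isFlat(5.2f))    // propped on an incline
}
```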
Become an expert on your particular corner of the app, investigating background material as needed. Build on top of the activity, service, and others’ code as you complete your exercise. Write your solution on a quarter sheet of paper. The exercises are as follows:

- In MainActivity, declare read-only property serviceIntent that returns an explicit intent for starting up the service.
- In MainActivity, register in onCreate a callback on the switch that starts the service as a foreground service when on, and stops the service when off.
- In MainActivity, write method createNotificationChannel that creates a notification channel for alarms of high importance.
- In CryService, write method onBind such that no client can bind to this service.
- In CryService, write method startRinging to start playing the phone’s default ringtone on loop.
- In CryService, write method stopRinging to stop playing the phone’s default ringtone and release any associated resources.
- In CryService, define field gravityListener as a SensorEventListener. When it detects that the phone is flat (either on its screen or back) but it wasn’t flat before, it starts the phone ringing. When it detects the phone is not flat but it was before, it stops the ringing.
- In CryService, write method becomeForegroundService that makes this service a foreground service.
- In CryService, write method onStartCommand that turns this service into a foreground service and registers our gravity sensor listener. Investigate the possible return values.
- In CryService, write method onDestroy that unregisters the gravity sensor listener and stops any ringing.
- When MainActivity first starts up, we need to set the initial state of the switch to on or off. If the service is running, we want it on. But there’s no built-in way to query whether the service is running. Instead, we can set up a within-app broadcast. In MainActivity, add code to onCreate that creates a local broadcast manager. Register a new BroadcastReceiver that responds only to action "pong". In its onReceive, toggle the switch on. Then send a synchronous broadcast of an Intent whose action is "ping".
- In CryService, write method onCreate to create a local broadcast manager. Register a new BroadcastReceiver that responds only to action "ping". In its onReceive method, send a synchronous broadcast of an Intent whose action is "pong".

See you next time!
P.S. It’s time for a haiku!
Last child, least photos
But I won’t be forgotten
I have Big Brother