Issue
I'm wondering how apps like Telegram fill the "played" part of the view that represents an audio recording. Here is an example from Telegram:
The initial color of the view representing an audio recording is grey. The played part is filled with blue.
How would I implement such a thing? I would store two images: the first would show the recording in its not-played state, and the second would show the same recording in its played state.
The not-played state:
The played state:
Then I would create a FrameLayout with two Views and use the ClipDrawable API to gradually reveal the played part. Here is my code:
clipped_view_animator.xml
<?xml version="1.0" encoding="utf-8"?>
<objectAnimator xmlns:android="http://schemas.android.com/apk/res/android"
    android:duration="15000"
    android:propertyName="level"
    android:valueTo="10000"
    android:valueType="intType" />
played_clip.xml
<?xml version="1.0" encoding="utf-8"?>
<clip xmlns:android="http://schemas.android.com/apk/res/android"
    android:clipOrientation="horizontal"
    android:drawable="@drawable/played"
    android:gravity="left" />
activity_main.xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:layout_margin="20dp"
    android:orientation="vertical"
    tools:context="ru.maksim.sample.MainActivity">

    <Button
        android:id="@+id/play"
        android:text="Play"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content" />

    <FrameLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent">

        <ImageView
            android:id="@+id/view"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="@drawable/not_played" />

        <ImageView
            android:id="@+id/clippedView"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:background="@drawable/played_clip" />

    </FrameLayout>

</LinearLayout>
MainActivity.kt
import android.animation.AnimatorInflater
import android.graphics.drawable.ClipDrawable
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.util.Log
import kotlinx.android.synthetic.main.activity_main.* // synthetic references: play, clippedView

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        play.setOnClickListener { play() }
    }

    private fun play() {
        // The ClipDrawable inflated from played_clip.xml (the ImageView's background).
        val clippedDrawable = clippedView.background as ClipDrawable
        Log.d(TAG, "level=${clippedDrawable.level}")
        // Animate the drawable's level from its current value up to 10000 (fully revealed).
        val animator = AnimatorInflater.loadAnimator(this, R.animator.clipped_view_animator)
        animator.setTarget(clippedDrawable)
        animator.start()
    }

    companion object {
        private const val TAG = "MainActivity"
    }
}
Video: https://youtu.be/9X8Yb9aKqmQ
My method works. However, I wonder whether there are better ways to do this (in terms of performance, lines of code, and so on). It's also clear from the video that I exceed the 16 ms-per-frame budget (Samsung Galaxy Tab 4 with Android 5.0.2), and it might be worse in a real-world app.
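For comparison, the fill does not have to be driven by a fixed-duration animator at all. The sketch below is only an illustration and is not part of the project above: the MediaPlayer wiring, the PlaybackProgressUpdater name, and the 33 ms polling interval are assumptions. It updates the same ClipDrawable level from the player's actual position, so the fill stays in sync with the audio and the per-frame work stays small.
PlaybackProgressUpdater.kt (hypothetical)
import android.graphics.drawable.ClipDrawable
import android.media.MediaPlayer
import android.os.Handler
import android.os.Looper

// Hypothetical helper, not part of the project above: keeps a ClipDrawable's level
// in sync with a MediaPlayer's playback position instead of a fixed-length animation.
class PlaybackProgressUpdater(
        private val mediaPlayer: MediaPlayer,
        private val clippedDrawable: ClipDrawable) {

    private val handler = Handler(Looper.getMainLooper())

    private val tick = object : Runnable {
        override fun run() {
            val duration = mediaPlayer.duration
            if (duration > 0) {
                // ClipDrawable levels go from 0 (fully clipped) to 10000 (fully revealed).
                // Use Long arithmetic so long recordings don't overflow Int.
                clippedDrawable.level =
                        (mediaPlayer.currentPosition.toLong() * 10000 / duration).toInt()
            }
            if (mediaPlayer.isPlaying) {
                handler.postDelayed(this, 33) // ~30 updates per second is enough for a fill
            }
        }
    }

    fun start() {
        handler.post(tick)
    }

    fun stop() {
        handler.removeCallbacks(tick)
    }
}
It could be started from the same play() click handler and stopped when the activity pauses.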
Solution
Your code seems to perform the function perfectly. However, an existing library is sometimes more efficient than code you write yourself. You can refer to these libraries to implement your feature:
- WaveForm
- Yalantis Horizon Wave
- Audio-Recorder-Visualization
- Semantive Waveform
- WaveInApp
- WaveformControl
These libraries don't provide exactly the same implementation, so you might have to tweak one to meet your requirements. I also found a similar question that helped me while I was working on a similar project; its answer uses plain Android drawing code to render an audio waveform. You can refer to that answer.
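If you go the custom-drawing route, a minimal sketch of such a view might look like the following. This is illustrative only and is not taken from any of the libraries above; the amplitudes and progress properties are assumed inputs that you would compute from the audio file and the playback position.
WaveformView.kt (illustrative)
import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.util.AttributeSet
import android.view.View

// Illustrative sketch only: a View that draws waveform bars and paints the played
// fraction in a different color. Real libraries also handle measurement, caching,
// touch seeking, and so on.
class WaveformView @JvmOverloads constructor(
        context: Context, attrs: AttributeSet? = null) : View(context, attrs) {

    // Normalized bar heights in [0, 1]; in a real app these come from the audio data.
    var amplitudes: FloatArray = FloatArray(0)
        set(value) {
            field = value
            invalidate()
        }

    // Played fraction in [0, 1], e.g. currentPosition / duration.
    var progress: Float = 0f
        set(value) {
            field = value.coerceIn(0f, 1f)
            invalidate()
        }

    private val unplayedPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { color = Color.GRAY }
    private val playedPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { color = Color.BLUE }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        if (amplitudes.isEmpty()) return
        val barWidth = width.toFloat() / amplitudes.size
        val playedRight = width * progress
        amplitudes.forEachIndexed { i, amp ->
            val left = i * barWidth
            val barHeight = amp * height
            val top = (height - barHeight) / 2f
            // Bars left of the played boundary use the "played" color.
            val paint = if (left < playedRight) playedPaint else unplayedPaint
            canvas.drawRect(left, top, left + barWidth * 0.7f, top + barHeight, paint)
        }
    }
}
Updating progress from the playback position repaints only this one view, which keeps the per-frame cost low.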
Answered By - Abhi