Are you getting the best out of your device's camera and GPU in your app?
Have you ever wondered how a view or camera frame is drawn on the screen?
Do you think your camera should do much more than just take a picture or record a video?
Have you ever tried to process frames from the camera hardware, for example by applying filters or beautification, before previewing them?
Android provides Camera APIs for retrieving frames from the camera hardware. Those frames can be manipulated, modified, or beautified in any way before being previewed or recorded.
You can even add custom views, such as static images, 3D face masks, or any other 3D object, to the preview after detecting objects like faces in the frames. This entire process is called rendering objects in camera preview frames.
Beyond rendering, we can also apply various filters, such as sepia and grayscale, to each frame before previewing it.
All of these features can be used in any Android app with OpenGL.
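As a taste of what such filters involve, here is a minimal sketch of the per-pixel math behind the grayscale and sepia filters mentioned above, written in plain Java for illustration. The class and method names are hypothetical; in a real app this math runs in a GLSL fragment shader on the GPU, once per fragment.

```java
// Illustrative per-pixel filter math (hypothetical names, not an Android API).
// Color channels are doubles in [0, 1], as they would be in a fragment shader.
public class FilterMath {
    // Grayscale: weighted luminance using the common BT.601 weights.
    static double[] grayscale(double r, double g, double b) {
        double y = 0.299 * r + 0.587 * g + 0.114 * b;
        return new double[] { y, y, y };
    }

    // Sepia: a fixed color matrix, with each channel clamped to [0, 1].
    static double[] sepia(double r, double g, double b) {
        return new double[] {
            Math.min(1.0, 0.393 * r + 0.769 * g + 0.189 * b),
            Math.min(1.0, 0.349 * r + 0.686 * g + 0.168 * b),
            Math.min(1.0, 0.272 * r + 0.534 * g + 0.131 * b)
        };
    }

    public static void main(String[] args) {
        // Pure red becomes a dark gray.
        System.out.printf("gray: %.3f%n", grayscale(1.0, 0.0, 0.0)[0]); // 0.299
        // White shifts toward warm brown tones (red channel clamps at 1.0).
        System.out.printf("sepia r: %.3f%n", sepia(1.0, 1.0, 1.0)[0]);  // 1.000
    }
}
```

In a fragment shader the same arithmetic is a one-line dot product or matrix multiply applied to the sampled camera texture color.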
How is a view or camera frame drawn on the screen?
What is OpenGL?
How does OpenGL control GPU rendering?
What are vertex and fragment shaders?
How do they affect the camera frame and preview?
How can we modify camera frames between the Camera APIs and the preview?
What can we do with that?
Well, let’s get all of these questions answered in this talk, along with the experiences and challenges we’ve faced.