Android Performance Patterns: Understanding VSYNC

Unbeknownst to most developers, a simple piece of hardware design dictates how fast your application can draw things to the screen.

You may have heard the term VSYNC. It stands for vertical synchronization, and it's an event that fires every time your screen starts to refresh the content it wants to show you.

Effectively, VSYNC is the product of two components: refresh rate (how fast the hardware can refresh the screen) and frames per second (how fast the GPU can draw images). In this video, +Colt McAnlis walks through each of these topics, discusses where VSYNC (and the 16 ms rendering barrier) comes from, and explains why it's critical to understand if you want a silky-smooth application.
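The arithmetic behind that 16 ms barrier is worth making explicit. Here is a minimal sketch (plain Java, not any Android API; the 20 ms frame time is an illustrative assumption) of the per-frame budget and what happens when you overrun it under vsync with double buffering:

```java
public class FrameBudget {
    // Milliseconds available to produce one frame at the given refresh rate.
    static double budgetMs(double refreshHz) {
        return 1000.0 / refreshHz;
    }

    // With vsync and double buffering, a frame that overruns its budget is
    // held until the next refresh, so the effective rate drops in whole steps.
    static double effectiveFps(double refreshHz, double frameMs) {
        int refreshesOccupied = (int) Math.ceil(frameMs / budgetMs(refreshHz));
        return refreshHz / refreshesOccupied;
    }

    public static void main(String[] args) {
        System.out.printf("Budget at 60 Hz: %.2f ms%n", budgetMs(60));           // ~16.67 ms
        System.out.printf("A 20 ms frame yields: %.0f fps%n", effectiveFps(60, 20)); // 30 fps
    }
}
```

Note the step function: missing the budget by even a few milliseconds doesn't cost you a few fps, it drops you straight from 60 to 30.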

Watch more Android Performance Patterns here:

#develop, #performance, #render, #vsync



8 responses to Android Performance Patterns: Understanding VSYNC

  1. I love these new videos! Short and to the point! Perfect for a refresher or quick learning session. What about linking a related I/O talk for a deeper understanding?

  2. Great set of videos, Colt. Takes me back twenty years to the Atari ST and Amiga, where these techniques were standard practice. In those days (how old do I sound?!) there was no "copy" process to move screen data from the back buffer to the screen buffer… You just told the GPU where in memory to get its "next" frame from, flipping an address register between two frame addresses each time a vertical sync occurred, rather than copying a whole screen's worth of data. Presumably Android does the same? It's far more efficient.
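The copy-versus-flip distinction in the comment above can be sketched in a few lines (plain Java standing in for framebuffer memory; the names and the four-pixel "screen" are purely illustrative, not any real graphics API):

```java
import java.util.Arrays;

public class PageFlip {
    static int[] front = new int[4];     // buffer the display is scanning out
    static int[] back  = {1, 2, 3, 4};   // buffer the renderer just finished

    // Copy approach: move every pixel from the back buffer to the front.
    static void presentByCopy() {
        System.arraycopy(back, 0, front, 0, back.length); // O(n) in screen size
    }

    // Flip approach: just exchange which buffer the display reads from.
    static void presentByFlip() {
        int[] tmp = front;               // O(1): only an address changes hands
        front = back;
        back = tmp;
    }

    public static void main(String[] args) {
        presentByFlip();
        System.out.println(Arrays.toString(front)); // [1, 2, 3, 4]
    }
}
```

As to the commenter's question: broadly, yes — Android's graphics stack passes buffer handles between the app and the compositor rather than copying pixel data, the modern descendant of that address-register flip.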

  3. Please fix the lag in your apps. Calendar is horribly janky while scrolling; so are Maps, Play Music (!), Play Store (!), and most others. Is it so hard to fix these? With such powerful smartphones, we shouldn't have to deal with those issues anymore.

  4. I'd like to point out an error: the video claims the solution to tearing is "double buffering", but it isn't. Regardless of vsync, applications almost always use double buffering and draw into the back buffer anyway (otherwise you'd see triangles being drawn by the GPU as it happens). The tearing happens NOT because the GPU is only using a single buffer, as the video claims, but because the back buffer is swapped with the front buffer while the monitor is still in the middle of presenting the front buffer.

    So the real solution is to synchronize the presentation of the back buffer so that it happens during the vertical blank interval (i.e., when the monitor has finished drawing the last scanline and is preparing to draw the first scanline of the next frame, which happens every 60th of a second on typical PC and mobile displays). AKA vsync.

    Vsync with double buffering (which was the norm prior to Jelly Bean) has a nasty problem of halving your framerate when you miss your target frequency. JB introduced triple buffering (which uses two back buffers and presents whichever one is ready, while the GPU draws into whichever one isn't being presented yet), and it solves that nasty problem. However, it introduces more input lag as the buffers struggle to catch up, and when they finally do, it manifests as micro-stutter or "jank." Unless you're able to hit the 1/60-second target all the time, you'll never be able to solve either of these issues.

    Here's hoping FreeSync (AMD's equivalent of Nvidia's G-Sync) comes to Android in the future.
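The double- versus triple-buffering trade-off described in the comment above can be sketched as a tiny simulation (plain Java; the 20 ms render time and the simplified render-bound model are illustrative assumptions, not measurements of any real pipeline):

```java
public class BufferingSim {
    static final double VSYNC_MS = 1000.0 / 60.0; // refresh interval at 60 Hz

    // Frames shown in one second when every frame takes renderMs to draw.
    // Models the render-bound case (renderMs >= VSYNC_MS).
    static int framesShown(double renderMs, boolean tripleBuffered) {
        double time = 0;  // when the current frame finishes rendering
        int shown = 0;
        while (time < 1000.0) {
            time += renderMs; // draw the frame into a back buffer
            if (!tripleBuffered) {
                // Double buffering: the lone back buffer stays locked until
                // the swap at the next vsync, so the renderer stalls there.
                time = Math.ceil(time / VSYNC_MS) * VSYNC_MS;
            }
            // With a second back buffer the renderer starts the next frame
            // immediately; the finished one simply waits for vsync.
            shown++;
        }
        return shown;
    }

    public static void main(String[] args) {
        // A steady 20 ms frame misses the ~16.7 ms budget every time.
        System.out.println("double buffered: " + framesShown(20, false) + " fps"); // 30
        System.out.println("triple buffered: " + framesShown(20, true) + " fps");  // 50
    }
}
```

Under this model a steady 20 ms frame drops to 30 fps with double buffering (each frame waits out a whole extra refresh) but reaches 50 fps with triple buffering, matching the behaviour the comment describes — and neither reaches the 60 fps you get by staying inside the budget.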

