How it works


Jiny provides live contextual guidance in vernacular audio within mobile apps. It identifies what is happening on the live user screen (the context) and guides the user with the right information at the right time.


Jiny identifies what is happening on the live user screen (the context) by detecting which UI elements are present on the screen. Whenever anything on the user’s screen changes, i.e. the properties of elements change or elements are created or removed, Jiny identifies the new context. We call these contexts ‘stages’.
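The idea above can be sketched in code. This is a minimal illustration, not Jiny’s actual implementation: it assumes a stage can be described by the set of UI element identifiers that must be present, and that a detector is re-run whenever the view hierarchy changes. All names (StageDetector, defineStage, detect) are hypothetical.

```java
import java.util.*;

// Hypothetical sketch: a "stage" is identified by the set of UI
// elements currently visible on screen. Names are illustrative only.
public class StageDetector {
    // Each stage is defined by the element IDs that must be present.
    private final Map<String, Set<String>> stages = new LinkedHashMap<>();

    public void defineStage(String name, Set<String> requiredElements) {
        stages.put(name, requiredElements);
    }

    // Called whenever the screen changes (elements created/removed or
    // their properties updated); returns the first matching stage, if any.
    public Optional<String> detect(Set<String> visibleElements) {
        for (Map.Entry<String, Set<String>> entry : stages.entrySet()) {
            if (visibleElements.containsAll(entry.getValue())) {
                return Optional.of(entry.getKey());
            }
        }
        return Optional.empty();
    }
}
```

In a real mobile app, `detect` would be triggered by a view-hierarchy change callback rather than called manually, and matching would consider element properties as well as presence.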

When a stage is identified, the corresponding instruction is played as audio, along with a visual indicator (a pointer) on the screen.
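The stage-to-instruction step can be sketched as a simple lookup. Again, this is an assumed illustration, not Jiny’s API: each stage maps to an audio clip and the element the pointer should highlight, and all names here (InstructionPlayer, register, onStage) are hypothetical.

```java
import java.util.*;

// Hypothetical sketch: once a stage is detected, look up its instruction
// (audio clip + element to point at). Names are illustrative only.
public class InstructionPlayer {
    static class Instruction {
        final String audioClip;     // vernacular audio to play
        final String pointerTarget; // element the on-screen pointer highlights
        Instruction(String audioClip, String pointerTarget) {
            this.audioClip = audioClip;
            this.pointerTarget = pointerTarget;
        }
    }

    private final Map<String, Instruction> byStage = new HashMap<>();

    public void register(String stage, String audioClip, String pointerTarget) {
        byStage.put(stage, new Instruction(audioClip, pointerTarget));
    }

    // Returns a description of what would be played and shown for a stage.
    public String onStage(String stage) {
        Instruction i = byStage.get(stage);
        if (i == null) return "no instruction";
        return "play " + i.audioClip + ", point at " + i.pointerTarget;
    }
}
```

Keeping detection (which stage is on screen) separate from playback (what to say and where to point) lets the instruction content be updated without touching the detection logic.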