
Just as we see in Windows 11, Google is working to give Android apps semantic (programmatic) capabilities so that they can be more easily controlled by AI agents. Today, the online giant gave developers a first glimpse of how that will work.
“User expectations for AI on their devices are fundamentally shifting how they interact with their apps,” Google vice president of Android development Matthew McCullough writes. “Instead of opening apps to do tasks step-by-step, they’re asking AI to do the heavy lifting for them. In this new interaction model, [developer] success is shifting from getting users to open your app, to successfully fulfilling their tasks and helping them get more done faster.”
Key to this shift is a coming Android capability called AppFunctions. Basically, AppFunctions allow Android apps to expose public interfaces for specific functionality that can be consumed by AI agents and other system-level services.
Or, as Google puts it, “AppFunctions allow your Android app to share specific pieces of functionality that the system and various AI agents and assistants can discover and invoke. By defining these functions, you enable your app to provide services, data, and actions to the Android OS, allowing users to complete tasks through AI agents and system-level interactions.”
Put yet another way, AppFunctions are to Android what the Model Context Protocol (MCP) is to cloud-based AI interconnectivity: a standardized way for agents to interact with mobile apps. Google exposes these capabilities through its Jetpack library and platform APIs, and all interactions occur locally on the device.
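Conceptually, the flow Google describes has three parts: an app exposes a named piece of functionality, an agent discovers that it exists, and the agent invokes it locally on the device without opening the app's UI. The sketch below models that flow in plain Java; the names and shapes here (`expose`, `isAvailable`, `invoke`, the `"notes.createNote"` identifier) are purely illustrative and are not the real Jetpack AppFunctions API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch of the expose / discover / invoke pattern behind
// AppFunctions. Not the actual Jetpack API -- just the concept.
public class AppFunctionSketch {
    // Maps a function id (e.g. "notes.createNote") to its handler.
    private final Map<String, Function<Map<String, String>, String>> registry =
            new HashMap<>();

    // An app registers a capability it is willing to expose to agents.
    public void expose(String id, Function<Map<String, String>, String> handler) {
        registry.put(id, handler);
    }

    // An agent discovers whether a capability exists before invoking it.
    public boolean isAvailable(String id) {
        return registry.containsKey(id);
    }

    // The agent invokes the function locally, on-device.
    public String invoke(String id, Map<String, String> args) {
        Function<Map<String, String>, String> fn = registry.get(id);
        if (fn == null) {
            throw new IllegalArgumentException("Unknown function: " + id);
        }
        return fn.apply(args);
    }

    public static void main(String[] args) {
        AppFunctionSketch system = new AppFunctionSketch();
        // A notes app exposes a "create note" action.
        system.expose("notes.createNote",
                a -> "Created note titled \"" + a.get("title") + "\"");
        // An agent fulfills a user request without opening the app's UI.
        Map<String, String> request = new HashMap<>();
        request.put("title", "Grocery list");
        System.out.println(system.invoke("notes.createNote", request));
        // prints: Created note titled "Grocery list"
    }
}
```

The point of the pattern is that the contract lives in the registry, not in any screen: the agent never navigates the notes app, it just calls the function the app chose to publish.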
Despite a warning that AppFunctions are an early-stage developer capability, Google has already deployed the first examples of this technology in the version of Gemini that will soon ship on the Samsung Galaxy S26 series phones, and it will expand this functionality to other Samsung devices running One UI 8.5 and higher in the near future. There, Gemini can interact with Calendar, Notes, and Tasks using backend AppFunctions, helping users streamline activities across those apps.
To get feedback and help developers get acquainted with AppFunctions, Google is rolling out an early preview for developers via a beta feature in the Gemini app on Galaxy S26 series and select Pixel 10 devices. Users can delegate multi-step tasks to agents through Gemini by double-pressing the power button, and this initial release will include support for “a curated selection of apps in the food delivery, grocery, and rideshare categories in the US and Korea to start.”
AppFunctions will be part of Android 17, McCullough adds, giving the initial stable release a rough mid-year schedule. He says there’s a lot more information to come.
