WWDC 2024 Apple Intelligence

Tim Cook: At Apple, it’s always been our goal to design powerful personal products that enrich people’s lives by enabling them to do the things that matter most, as simply and easily as possible. We’ve been using artificial intelligence and machine learning for years to help us further that goal. Recent developments in generative intelligence and large language models offer powerful capabilities that provide the opportunity to take the experience of using Apple products to new heights. So, as we look to build in these incredible new capabilities, we want to ensure that the outcome reflects the principles at the core of our products. It has to be powerful enough to help with the things that matter most to you. It has to be intuitive and easy to use. It has to be deeply integrated into your product experiences. Most importantly, it has to understand you and be grounded in your personal context, like your routine, your relationships, your communications, and more. And, of course, it has to be built with privacy from the ground up. Together, all of this goes beyond artificial intelligence. It’s personal intelligence, and it’s the next big step for Apple. [Music] Introducing Apple Intelligence, the new personal intelligence system that makes your most personal products even more useful and delightful. To tell you all about it, here’s Craig.
Craig Federighi: This is a moment we’ve been working towards for a long time. We are tremendously excited about the power of generative models. And there are already some really impressive chat tools out there that perform a vast array of tasks using world knowledge. But these tools know very little about you or your needs. With iOS 18, iPadOS 18, and macOS Sequoia, we are embarking on a new journey to bring you intelligence that understands you. Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac. It draws on your personal context to give you intelligence that’s most helpful and relevant for you. It protects your privacy at every step. And it is deeply integrated into our platforms and throughout the apps you rely on to communicate, work, and express yourself. Let’s take a closer look at Apple Intelligence starting with its incredible capabilities. Then, we’ll tell you about its unique architecture. And after that, we’ll show you how it elevates so many of your everyday experiences. Let’s begin with capabilities.
Apple Intelligence will enable your iPhone, iPad, and Mac to understand and create language, as well as images, and take action for you to simplify interactions across your apps. And what’s truly unique is its understanding of your personal context. Language and text are fundamental to how we communicate and work. And the large language models built into Apple Intelligence deliver deep natural language understanding, making so many of your day-to-day tasks faster and easier. For example, your iPhone can prioritize your notifications to minimize unnecessary distractions, while ensuring you don’t miss something important. Apple Intelligence also powers brand-new Writing Tools that you can access systemwide to feel more confident in your writing. Writing Tools can rewrite, proofread, and summarize text for you, whether you are working on an article or blog post, condensing ideas to share with your classmates, or looking over a review before you post it online. And they are available automatically across Mail, Notes, Safari, Pages, Keynote, and even your third-party apps. In addition to language, Apple Intelligence offers a host of capabilities for images. From photos to emojis and GIFs, it’s so much fun to express ourselves visually. And now you can create totally original images to make everyday conversations even more enjoyable. And because Apple Intelligence understands the people in your photo library, you can personalize these images for your conversations. So, when you wish a friend a happy birthday, you can create an image of them surrounded by cake, balloons, and flowers to make it extra festive. And the next time you tell Mom that she’s your hero, you can send an image of her in a superhero cape to really land your point. You can create images in three unique styles: Sketch, Illustration, and Animation.
In addition to Messages, this experience is built into apps throughout the system, like Notes, Freeform, Keynote, and Pages. Another way Apple Intelligence is deeply impactful is its ability to take action across your apps. The greatest source of tools for taking actions is already in your pocket with the apps you use every day. And we have designed Apple Intelligence so it can tap into these tools and carry out tasks on your behalf. So, you can say things like, “Pull up the files that Joz shared with me last week,” or, “Show me all the photos of Mom, Olivia, and me,” or, “Play the podcast that my wife sent the other day.” We are designing Apple Intelligence to be able to orchestrate these and hundreds of other actions for you, so you can accomplish more while saving time. There’s one more critical building block for personal intelligence, and that’s an understanding of your personal context. Apple Intelligence is grounded in your personal information and context with the ability to retrieve and analyze the most relevant data from across your apps, as well as to reference the content on your screen, like an email or calendar event you are looking at. This can be incredibly useful in so many moments throughout the day. Suppose one of my meetings is being re-scheduled for late in the afternoon, and I’m wondering if it’s going to prevent me from getting to my daughter’s play performance on time. Apple Intelligence can process the relevant personal data to assist me. It can understand who my daughter is, the play details she sent several days ago, the time and location for my meeting, and predicted traffic between my office and the theater.
Understanding this kind of personal context is essential for delivering truly helpful intelligence. But it has to be done right. You should not have to hand over all the details of your life to be warehoused and analyzed in someone’s AI cloud. With Apple Intelligence, powerful intelligence goes hand in hand with powerful privacy. Let me tell you more about its architecture, and how it is built with privacy at the core. The cornerstone of the personal intelligence system is on-device processing. We have integrated it deeply into your iPhone, iPad, and Mac and throughout your apps, so it’s aware of your personal data, without collecting your personal data. This is only possible through our unique integration of hardware and software, and our years-long investment in building advanced silicon for on-device intelligence. Deeply integrated generative models require immense processing power. And with our most advanced Apple silicon, the A17 Pro and M-family of chips, we have the computational foundation to power Apple Intelligence. This personal intelligence system comprises highly capable large language and diffusion models that are specialized for your everyday tasks, and can adapt on the fly to your current activity. It also includes an on-device semantic index that can organize and surface information from across your apps. When you make a request, Apple Intelligence uses its semantic index to identify the relevant personal data, and feeds it to the generative models so they have the personal context to best assist you. Many of these models run entirely on device. There are times, though, when you need models that are larger than what fits in your pocket today. Servers can help with this. But traditionally, servers can also store your data without you realizing it, and use it in ways you did not intend. 
And since server software is only accessible to its owners, even if a company says it’s not misusing your data, you have no way to verify that claim, or to know if it changes over time.
In contrast, when you use an Apple device like your iPhone, you are in control of your data, where it is stored, and who can access it. And because the software image for your iPhone is accessible to independent experts, they can continuously verify its privacy. We want to extend the privacy and security of your iPhone into the cloud to unlock even more intelligence for you. So, we have created Private Cloud Compute. Private Cloud Compute allows Apple Intelligence to flex and scale its computational capacity, and draw on even larger, server-based models for more complex requests, while protecting your privacy. These models run on servers we have specially built using Apple silicon. These Apple silicon servers offer the privacy and security of your iPhone from the silicon on up, draw on the security properties of the Swift programming language, and run software with transparency built in. When you make a request, Apple Intelligence analyzes whether it can be processed on-device. If it needs greater computational capacity, it can draw on Private Cloud Compute, and send only the data that’s relevant to your task to be processed on Apple silicon servers. Your data is never stored or made accessible to Apple. It’s used exclusively to fulfill your request. And just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise. In fact, Private Cloud Compute cryptographically ensures your iPhone, iPad, and Mac will refuse to talk to a server unless its software has been publicly logged for inspection. This sets a brand-new standard for privacy in AI, and unlocks intelligence you can trust. So that’s a look at the powerful capabilities of Apple Intelligence and its groundbreaking privacy protections. 
Now we’d love to show you how it will transform your apps and experiences across iOS 18, iPadOS 18, and macOS Sequoia, from a big leap forward for Siri, to powerful tools for writing and communication, and fun visual ways to express yourself. Let’s start with Siri. Here’s Kelsey to tell you more.
Kelsey Peterson: Today, Siri helps you get everyday tasks done quickly and easily. In fact, Siri users make 1.5 billion voice requests every single day. Thirteen years ago, we introduced Siri, the original intelligent assistant. And we had an ambitious vision for it. We’ve been steadily building towards that vision. And now, thanks to the incredible power of Apple Intelligence, we have the foundational capabilities to take a major step forward. So we can make Siri more natural, more contextually relevant, and of course, more personal to you. Right off the bat, you’ll see Siri’s got a new look. Let me show you. When you talk to Siri, you’ll notice it’s more deeply integrated into the system experience, with this elegant glowing light that wraps around the edge of your screen. And you can speak to Siri more naturally thanks to richer language understanding capabilities. Even if I stumble over my words, Siri understands what I’m getting at.
What does the weather look like for tomorrow at Muir Beach? Oh, wait, I meant Muir Woods!
Siri: The forecast is calling for clear skies in the morning near Muir Woods National Monument.
Kelsey Peterson: Sometimes it takes me a beat to figure out what I actually want to ask Siri, and now it follows right along. Siri also maintains conversational context, so I can follow up and say…
Siri: Hike is scheduled for 9:00 a.m. to 11:00 a.m. on June 11th.
Kelsey Peterson: I didn’t have to mention Muir Woods again. Siri understood what I meant when I said “there.” There are also certain times when you might not want to speak to Siri out loud. What’s great is that now, at any time, you have the option to type to Siri. With just a double tap at the bottom of the screen, I can quickly and quietly ask Siri to set an alarm. And you can switch between text and voice, communicating in whatever way feels right for the moment. We’re also laying the groundwork for some brand-new ways that Siri will be able to support you, one of which is its extensive product knowledge. Siri now holds a great deal of information about features and settings and can answer thousands of questions when you want to know how to do something on your iPhone, iPad, or Mac. Even if you don’t know exactly what a feature is called, you can just describe it and Siri will find the info you’re looking for. Like this: “How can I write a message now and have it be delivered tomorrow?” Siri understood what feature I was referring to, and now I have step-by-step guidance on how to use the new Send Later feature in Messages. Everything I’ve shown you so far will be available from the moment you start using Apple Intelligence. And over the course of the next year, we will be rolling out more features that make Siri even more personal and capable.
For one, Apple Intelligence will provide Siri with on-screen awareness, so it’ll be able to understand and take action with things on your screen. For example, say, a friend texts you his new address. Right from the Messages thread, you can say, “Add this address to his contact card,” and Siri will take care of it. Siri will also understand more of the things you get done in your apps. And with new orchestration capabilities provided by Apple Intelligence, Siri will take actions inside apps on your behalf. Siri will have the ability to take hundreds of new actions in and across apps, including some that leverage our new writing and image generation capabilities. For example, you’ll be able to say, “Show me my photos of Stacey in New York wearing her pink coat,” and Siri will bring those right up. Then you might say, “Make this photo pop,” and Siri will enhance it, just like that. And Siri will be able to take actions across apps, so you could say, “Add this to my note with Stacey’s bio,” and it will jump from the Photos app to the Notes app to make it happen. This is going to bring us closer to realizing our vision in which Siri moves through the system in concert with you. This is made possible through significant enhancements that we are making to App Intents, a framework that lets apps define a set of actions for Siri, Shortcuts, and other system experiences. And this won’t be limited to apps made by Apple. Developers will be able to use the App Intents framework to define actions in their apps and tap into Apple Intelligence too. So, you might ask Siri to take a light trails video in Pro Camera by Moment. Or ask Siri to share a summary of your meeting notes in an email you’re drafting to a teammate in Superhuman. And this is only the beginning. Siri will be able to understand and take more actions in more apps over time.
There’s one more set of really cool and useful capabilities coming to Siri. Thanks to Apple Intelligence, it has awareness of your personal context. With its semantic index of things like photos, calendar events, and files, plus information that’s stashed in messages and emails, like hotel bookings, PDFs of concert tickets, and links that your friends have shared, Siri will find and understand things it never could before. And with the powerful privacy protections of Apple Intelligence, Siri will use this information to help you get things done without compromising your privacy. You’ll be able to ask Siri to find something when you can’t remember if it was in an email, a text, or a shared note, like some book recommendations that a friend sent you a while back. Or for times when you’re filling out a form and need to input your driver’s license, Siri will be able to find a photo of your license, extract your ID number, and type it into the form for you. I want to show you one more demo that will give you a sense for how powerful Siri will be when it draws on the personal context awareness and action capabilities built into Apple Intelligence. Imagine that I am planning to pick my mom up from the airport, and I’m trying to figure out my timing. Siri is going to be able to help me do this so easily. Siri, when is my mom’s flight landing? What’s awesome is that Siri actually cross-references flight details that my mom shared with me by email with real-time flight tracking to give me her up-to-date arrival time. What’s our lunch plan? I don’t always remember to add things to my calendar, and so I love that Siri can help me keep track of plans that I’ve made in casual conversation, like this lunch reservation my mom mentioned in a text. How long will it take us to get there from the airport? I haven’t had to jump from Mail to Messages to Maps to figure out this plan. 
And a set of tasks that would have taken minutes on my own and honestly probably would have resulted in a call to my mom could be addressed in a matter of seconds. That’s just a glimpse of the ways in which Siri is going to become more powerful and more personal thanks to Apple Intelligence. And all of these updates to Siri are also coming to iPad and Mac, where Siri’s new design is a total game-changer. It makes Siri feel seamlessly integrated with your workflow. Thanks to the capabilities of Apple Intelligence, this year marks the start of a new era for Siri. Here’s Justin to show you more places throughout the system where Apple Intelligence simplifies and accelerates your tasks.
Justin Titi: Apple Intelligence unlocks incredible new ways to enhance your writing, whether you are tidying up your hastily-written class notes, ensuring your blog post reads just right on WordPress, or making sure your email is perfectly crafted. Let’s use Mail to take a closer look at how the systemwide Writing Tools can help you communicate even more effectively. Rewrite gives you different versions of what you have written, so you can choose the one you like best. This is great for making sure your cover letter for that job you’re excited for lands perfectly. And suggestions are shown inline, so you can go with the combination of flow and wording that works for you. Rewrite also helps you get the tone right. Have you ever re-read a work email that you just wrote and thought, “Oh, this might not go over well”? Well, now you can change the tone of that response to your colleague to make it sound more friendly, professional, or concise. You can also describe how you’d like it rewritten. For example, you can invite your friends to a get-together with a one-of-a-kind invitation written as a poem. Who could say no to that? Another way Writing Tools can help you is with Proofread. Say you’re emailing your English professor. With Proofread, you can nail grammar, word choice, and sentence structure to put your best foot forward. You can review suggested edits and their explanations individually, or accept them all with a click. And if you are about to email a project status that has gotten quite long, use Summarize to bring out the key points, and then add them as a TL;DR right at the top. In addition to Mail, you can access Writing Tools systemwide, nearly everywhere you write, including third-party apps. Apple Intelligence also powers Smart Reply in Mail. For example, when you need to RSVP to an event, you will now see suggestions for your response based on the email. 
If you say you’ll be there, Mail identifies questions you were asked in the invite, and offers intelligent selections so you can quickly choose your responses. Your drafted response incorporates your answers. So, with just a few taps, you’re ready to send it off with all the right details.
Finally, let’s talk about how Apple Intelligence helps you stay on top of a busy inbox. We all deal with sorting through a ton of email every day. And now it is easier and faster than ever to browse your inbox. Instead of previewing the first few lines of each email that don’t always convey the most useful information, you can now see summaries, visible right from your email list. So without even opening the email, you’ll know that your team is meeting on Thursday to discuss a new design. And if you jump into a particularly long email when you’re in a hurry, you can tap to reveal a summary at the top of the email and cut right to the chase. We’re also elevating Priority Messages. Apple Intelligence can understand the content of the emails you receive, determine what’s most urgent, and surface it right at the top. Like a dinner invite for tonight, or a boarding pass for your trip this afternoon. And deep understanding of language extends beyond your inbox into more places, like your Notifications. First, just like in Mail, your Priority Notifications appear at the top of the stack, letting you know what to pay attention to at a glance. And to make scanning your notifications faster, they’re summarized. So, when the group chat is blowing up, you can quickly see that Savita booked the house and Lia is arriving early, right from your Lock Screen. Apple Intelligence also enables an all-new Focus called Reduce Interruptions. It understands the content of your notifications to selectively surface only the ones that might need immediate attention, like a text about today’s daycare pickup. From catching up on Priority Notifications, to staying present and focused with Reduce Interruptions, and refining your words with Writing Tools, Apple Intelligence helps you save time in so many ways. Now, over to Cyrus to show you how it unlocks new ways to express yourself.
Cyrus Irani: Apple Intelligence enables you to create fun, original images whether you are sprucing up a Keynote for class or trying to land an idea while collaborating in Freeform. And third-party apps can offer this experience too, like in Craft, where you can create a delightful image to add to your document. Let’s take a closer look at how Apple Intelligence helps you express yourself visually in Messages. One of the most fun ways to communicate in Messages is with emoji. But even with thousands of emoji to choose from, there are times when you can’t quite find the right one for how you feel. So, we’re introducing Genmoji. Leveraging the power of Apple Intelligence, you can create Genmoji, on-device, right in the Keyboard, and match any moment perfectly. Just provide a description and you’ll see your Genmoji appear right before your eyes, along with more options to choose from. This is great in those times when you’re updating a friend about your relaxing weekend, getting the group chat excited about brunch, or complaining about the rowdy squirrel right outside your window. And because Apple Intelligence is aware of who’s in your photo library, you can simply pick someone and create a Genmoji that looks just like them! These are perfect for sharing with friends as a sticker, reacting to messages with a Tapback, and you can even add Genmoji inline in your messages! Let your imagination run wild as you create just the right Genmoji! And because it’s so much fun to use images to express ourselves, we went even further with a new system experience we call Image Playground.
This is a new way to create playful images in just seconds. It’s so easy to use, and we’ve built it right into apps like Messages. To get started, you can choose from a range of concepts like themes, costumes, accessories, places, and more. When you select them, they get added to your playground. No need to engineer the perfect prompt. In a few seconds, you’ll see Apple Intelligence create a preview of what your image could look like. A moment later, you’ll see more previews you can swipe through. This all happens on-device! So, you have the freedom to experiment and create as many images as you want. This is great for quickly responding to your friends with just the right image. When you have a really specific idea in mind, you can just type a description to add it to your playground. And you can easily adjust which style you want to use and choose from Animation, Sketch, or Illustration. Whichever suits the vibe of your conversation. If you change your mind along the way, no problem! Just switch back and you’ll see your previous previews. It’s that simple. Since Apple Intelligence understands your personal context, you’ll see suggestions for concepts related to your Messages conversation, including you and people from your Messages thread. When selected, it uses appearances from Photos to add you, or one of them, to the image you’re creating. [Music] With an intuitive experience to create totally original images, and so many ways to express what you want, the Image Playground is going to make everyday conversations a whole lot more fun. In addition to Messages, this experience is also available in apps like Keynote, Pages, and Freeform. To make it easy to experiment with creating images, we’ve also built a dedicated Image Playground app. You can use it to try out Styles, play around with different concepts, and make something to share with friends in other apps or on social media. 
And developers can integrate the new Image Playground experience into their apps too, with a new API. With the Image Playground experience and Genmoji, you can create fun and delightful images right where you need them. Now, here’s Seb to show you more experiences enabled by the powerful capabilities of Apple Intelligence.
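[Editor’s note] The keynote doesn’t spell out the Image Playground API, but an integration might look roughly like this SwiftUI sketch. The framework name, the `imagePlaygroundSheet` modifier, and its parameters are assumptions based on the announcement, not confirmed API; check the shipping SDK before relying on them.

```swift
import SwiftUI
import ImagePlayground  // assumed framework name from the announcement

struct StickerButton: View {
    @State private var showingPlayground = false
    @State private var createdImageURL: URL?

    var body: some View {
        Button("Create Image") { showingPlayground = true }
            // Presents the system Image Playground sheet, seeded with a
            // text concept; the modifier and parameter names here are
            // assumptions about the final API surface.
            .imagePlaygroundSheet(
                isPresented: $showingPlayground,
                concepts: [.text("birthday cake and balloons")]
            ) { url in
                // The system hands back a file URL for the generated image.
                createdImageURL = url
            }
    }
}
```

The point of the sketch is the shape of the integration: the app supplies concepts and receives a finished image, while the generation UI and on-device models are system-provided.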
Sebastien Marineau-Mes: With the ability to deeply understand and create images, Apple Intelligence unlocks some fantastic new experiences. Like, a brand-new tool in the Notes app that we call Image Wand. Image Wand can transform a rough sketch into a polished image that complements your notes and makes them more visual. And it’s available right in your tool palette. Suppose you want a better image for your architectural history course. With Image Wand, you can circle your rough sketch using Apple Pencil to open up an Image Playground within your note. Image Wand uses on-device intelligence to analyze your sketch and words and creates an image for you. What’s really fun is that you can even circle empty space, and it will pull out context from the surrounding area to suggest the ideal image to go with your note. It has never been easier to make your notes more visual and engaging. Apple Intelligence also helps us make the most out of our ever-growing photo libraries. First, we have an update to photo editing. We’ve all had that time when we thought we got the perfect shot, then realized later it wasn’t quite perfect. Now, the new Clean Up tool will identify distracting objects in the background, so you can make them disappear, without accidentally changing your subject. Plus, searching for photos and videos is much more convenient, because you can now use natural language phrases. So, you can search for really specific things, like “Maya skateboarding in a tie-dye shirt,” or “Katie with stickers on her face.” Search in videos is also more powerful, with the ability to find a particular moment in the middle of a video clip. So, you can go right to the relevant segment when you search for that video of Maria cartwheeling on the grass.
Apple Intelligence also makes it so much more delightful to create a Memory Movie. Today, when you want to use your photos and videos to create a movie yourself, like for your fishing trips with your kids, it can take hours of work. You have to search through tons of photos to pick out the best ones, figure out how to arrange them, and hunt for the right music. Now, thanks to Apple Intelligence, it is super easy to create a memory about the story you want to see. Just type a description, and it can interpret that “learning to fish” involves things like water, docks, fishing rods, and boats. Using its language and image understanding, Apple Intelligence picks out the best photos and videos. And then it crafts a storyline with unique chapters that are based on themes identified from your photos, and arranges them into a movie with its own narrative arc. So now I can watch a wonderful Memory that starts with my son practicing on the dock, transitions to fishing on the boat, and finishes with us holding the prize catch. And all of this is set to the perfect song selected from Apple Music. Like all of Apple Intelligence, these updates to Photos are built on a foundation of privacy, so your photos and videos are not shared with Apple, or anyone else. With endless possibilities, it is so much fun trying out different ideas and revisiting our most precious moments. And now, back to Craig.
Craig Federighi: Apple Intelligence is truly unique in how it understands you and meets you where you are. And what you saw here is just the beginning. It enables so many more helpful features. For example, in the Notes app, you can now record and transcribe audio, to capture detailed notes while staying present in the moment. And when your recording is finished, Apple Intelligence generates a summary to help you recall the key points at a glance. Recordings, transcriptions, and Apple Intelligence-powered summaries are also coming to the Phone app. And when you start a recording in a live call, participants are automatically notified, so no one is surprised. Apple Intelligence is available for free with iOS 18, iPadOS 18, and macOS Sequoia, bringing you personal intelligence across the products you use every day. Still, there are other artificial intelligence tools available that can be useful for tasks that draw on broad world knowledge, or offer specialized domain expertise. We want you to be able to use these external models without having to jump between different tools. So, we’re integrating them right into your experiences. And we’re starting out with the best of these, the pioneer and market leader ChatGPT from OpenAI, powered by GPT-4o.
First, we built support into Siri, so Siri can tap into ChatGPT’s expertise when it might be helpful for you. For example, if you need menu ideas for an elaborate meal to make for friends using some freshly caught fish and ingredients from your garden, you can just ask Siri. Siri determines that ChatGPT might have good ideas for this, asks your permission to share your question, and presents the answer directly. You can also include photos with your questions. If you want some advice on decorating, you can take a picture and ask, “What kind of plants would go well on this deck?” Siri confirms if it’s okay to share your photo with ChatGPT and brings back relevant suggestions. It’s a seamless integration. In addition to photos, you can also ask questions related to your documents, presentations, or PDFs. We’ve also integrated ChatGPT into the systemwide Writing Tools with Compose. You can create content with ChatGPT for whatever you’re writing about. Suppose you want to create a custom bedtime story for your six-year-old who loves butterflies and solving riddles. Put in your initial idea and send it to ChatGPT to get something back she’ll love. Compose can also help you tap into ChatGPT’s image capabilities to generate images in a wide variety of styles to illustrate your bedtime story. You’ll be able to access ChatGPT for free and without creating an account. Your requests and information will not be logged. And for ChatGPT subscribers, you’ll be able to connect your account and access paid features right within our experiences. Of course, you’re in control of when ChatGPT is used and will be asked before any of your information is shared. ChatGPT integration will be coming to iOS 18, iPadOS 18, and macOS Sequoia later this year. We also intend to add support for other AI models in the future.
Now, let’s talk about developers, and how they can integrate the experiences powered by Apple Intelligence into their apps. We have updated our SDKs with new APIs and frameworks. For example, developers can add the Image Playground experience to their app with just a few lines of code. This means that an app like Craft can help users create images to make their documents much more visual. And Writing Tools are automatically available within apps that use the standard editable text view. So, without any development effort, an app like Bear Notes can automatically allow users to rewrite, proofread, and summarize notes. Plus, we are building many more ways for users to take action in apps with Siri. If a developer has already adopted SiriKit, they’ll see immediate enhancements from many of Siri’s new capabilities without additional work. We’re also investing deeply in the App Intents framework to connect the vast world of apps with Apple Intelligence. We’re defining new intents across our operating systems and making them available to developers starting with these categories. These intents are pre-defined, trained, and tested, so they’re easy for developers to adopt. Using new App Intents, an app like Darkroom will be able to use the Apply Filter intent to give users the ability to say, “Apply a cinematic preset to the photo I took of Ian yesterday.” These are just a handful of the updates coming to our platform SDKs so developers can add intelligent and useful features to their apps. We will share more details in the Platforms State of the Union later today, like how we are bringing generative intelligence to Xcode for developing apps using Swift and SwiftUI, with features like on-device code completion, and smart assistance for Swift coding questions.
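[Editor’s note] As a rough illustration of the App Intents adoption described above, here is a minimal sketch of how a photo-editing app might expose an “Apply Filter” action. The intent, parameter, and result types follow the existing AppIntents framework, but the specific names and the placeholder filtering logic are illustrative assumptions, not Apple’s pre-defined intent schema.

```swift
import AppIntents

// A hypothetical intent a photo-editing app could expose so Siri and
// Shortcuts can apply a named filter to a photo on the user's behalf.
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    @Parameter(title: "Filter Name")
    var filterName: String

    @Parameter(title: "Photo")
    var photo: IntentFile

    func perform() async throws -> some IntentResult & ReturnsValue<IntentFile> {
        // The app's own image pipeline would apply `filterName` here;
        // this placeholder simply returns the input unchanged.
        let filtered = photo
        return .result(value: filtered)
    }
}
```

Because the system knows the intent’s title and parameters, a spoken request like “Apply a cinematic preset to the photo I took of Ian yesterday” can, in principle, be routed to `perform()` with the right photo and filter name filled in.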
So that’s Apple Intelligence, with tremendous benefits for developers and users. This is AI for the rest of us, personal intelligence you can rely on at work, home, and everywhere in between. Apple Intelligence harnesses the power of our most advanced silicon, and will be available on iPhone 15 Pro, and iPad and Mac with M1 and later. Apple Intelligence will be available to try out in US English this summer. We are bringing it to users in beta as part of iOS 18, iPadOS 18, and macOS Sequoia this fall, with some features and additional languages and platforms coming out over the course of the next year. This is the beginning of an exciting new chapter of personal intelligence. Intelligence built for your most personal products: your iPhone, iPad, and Mac. Intelligence grounded in the things that make you, you. And intelligence available to you systemwide, so you can get things done in the way that works for you. We are just getting started, and I hope you are as excited as I am for the road ahead. And now, back to Tim.
Tim Cook: Thank you, Craig, and thanks to all of our presenters. It’s been an exciting day of announcements. We shared powerful new features and advancements to our six incredible platforms. And the introduction of powerful new Apple Intelligence features to iOS 18, iPadOS 18, and macOS Sequoia make these releases game-changers. Built in a uniquely Apple way, we think Apple Intelligence is going to be indispensable to the products that already play such an integral role in our lives. We have a big week ahead for developers. It kicks off this afternoon with the Platforms State of the Union. We also have over a hundred technical sessions, live forums, in-depth consultations, and Q&As with Apple engineers. All of this content is available online, for free, for developers. We’re excited to provide developers with the amazing new OS platforms and technologies we announced today, as well as tools and resources to help them do the very best work of their lives. Thank you so much for joining us. Let’s have a great WWDC!
