Machine learning (ML) and artificial intelligence (AI) now permeate nearly every feature on the iPhone, but Apple hasn't been touting these technologies like some of its competitors have. I wanted to understand more about Apple's approach, so I spent an hour talking with two Apple executives about the company's strategy—and the privacy implications of all the new features based on AI and ML.
Historically, Apple has not had a public reputation for leading in this area. That's partially because people associate AI with digital assistants, and reviewers frequently call Siri less useful than Google Assistant or Amazon Alexa. And with ML, many tech enthusiasts say that more data means better models—but Apple is not known for data collection in the same way as, say, Google.

Despite this, Apple has included dedicated hardware for machine learning tasks in most of the devices it ships. Machine intelligence-driven functionality increasingly dominates the keynotes where Apple executives take the stage to introduce new features for iPhones, iPads, or the Apple Watch. The introduction of Macs with Apple silicon later this year will bring many of the same machine intelligence developments to the company's laptops and desktops, too.
In the wake of the Apple silicon announcement, I spoke at length with John Giannandrea, Apple's Senior Vice President for Machine Learning and AI Strategy, as well as with Bob Borchers, VP of Product Marketing. They described Apple's AI philosophy, explained how machine learning drives certain features, and argued passionately for Apple's on-device AI/ML strategy.
Table of Contents
What is Apple's AI strategy?
Both Giannandrea and Borchers joined Apple in the past couple of years; each previously worked at Google. Borchers actually rejoined Apple after time away; he was a senior director of marketing for the iPhone until 2009. And Giannandrea's defection from Google to Apple in 2018 was widely reported; he had been Google's head of AI and search.
Google and Apple are quite different companies. Google has a reputation for participating in, and in some cases leading, the AI research community, whereas Apple used to do most of its work behind closed doors. That has changed in recent years, as machine learning powers numerous features in Apple's devices and Apple has increased its engagement with the AI community.

“When I joined Apple, I was already an iPad user, and I loved the Pencil,” Giannandrea (who goes by 'J.G.' to colleagues) told me. “So, I would track down the software teams and I would say, ‘Okay, where's the machine learning team that's working on handwriting?’ And I couldn't find it.” It turned out the team he was looking for didn’t exist—a surprise, he said, given that machine learning is one of the best tools available for the feature today.
“I knew that there was so much machine learning that Apple should do that it was surprising that not everything was actually being done. And that has changed dramatically in the last two to three years,” he said. “I really honestly think there's not a corner of iOS or Apple experiences that will not be transformed by machine learning over the coming few years.”
I asked Giannandrea why he felt Apple was the right place for him. His answer doubled as a succinct summary of the company's AI strategy:
I think that Apple has always stood for that intersection of creativity and technology. And I think that when you're thinking about building smart experiences, having vertical integration, all the way down from the applications, to the frameworks, to the silicon, is really essential... I think it's a journey, and I think that this is the future of the computing devices that we have, is that they be smart, and that, that smart sort of disappear.
Borchers chimed in too, adding, “This is clearly our approach, with everything that we do, which is, ‘Let's focus on what the benefit is, not how you got there.’ And in the best cases, it becomes automagic. It disappears... and you just focus on what happened, as opposed to how it happened.”

Speaking again of the handwriting example, Giannandrea made the case that Apple is best positioned to “lead the industry” in building machine intelligence-driven features and products:
We made the Pencil, we made the iPad, we made the software for both. It's just unique opportunities to do a really, really good job. What are we doing a really, really good job at? Letting somebody take notes and be productive with their creative thoughts on digital paper. What I'm interested in is seeing these experiences be used at scale in the world.
He contrasted this with Google. 'Google is an amazing company, and there's some really great technologists working there,' he said. 'But fundamentally, their business model is different and they're not known for shipping consumer experiences that are used by hundreds of millions of people.'
How does Apple use machine learning today?
Apple has made a habit of crediting machine learning with improving some features in the iPhone, Apple Watch, or iPad in its recent marketing presentations, but it rarely goes into much detail—and most people who buy an iPhone never watch those presentations, anyway. Contrast this with Google, for example, which places AI at the center of much of its messaging to consumers.
There are numerous examples of machine learning being used in Apple's software and devices, most of them new in just the past couple of years.
Machine learning is used to help the iPad's software distinguish between a user accidentally pressing their palm against the screen while drawing with the Apple Pencil, and an intentional press meant to provide an input. It's used to monitor users' usage habits to optimize device battery life and charging, both to improve the time users can spend between charges and to protect the battery's long-term viability. It's used to make app recommendations.
Then there's Siri, which is perhaps the one thing any iPhone user would immediately perceive as artificial intelligence. Machine learning drives several aspects of Siri, from speech recognition to attempts by Siri to offer useful answers.

Savvy iPhone owners might also notice that machine learning is behind the Photos app's ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app's search field.
In other cases, few users may realize that machine learning is at work. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
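Apple hasn't published the details of this compositing pipeline, but the frame-selection idea behind it can be illustrated with a toy sketch: score each frame's sharpness (here using the variance of a Laplacian response, a common heuristic, not Apple's actual algorithm) and keep the best-scoring frame from the burst. All function names and the scoring method below are illustrative assumptions.

```python
import numpy as np

def laplacian_variance(img):
    # Sharpness heuristic: variance of a 4-neighbor Laplacian response.
    # Blurring suppresses high frequencies, so blurrier frames score lower.
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

def pick_sharpest(frames):
    # Keep the frame from the burst whose high-frequency detail scores highest.
    return max(frames, key=laplacian_variance)

# Toy burst: random high-frequency detail vs. a neighbor-averaged (blurred) copy.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp
           + np.roll(sharp, 1, axis=0) + np.roll(sharp, -1, axis=0)
           + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1)) / 5

best = pick_sharpest([blurred, sharp])
```

A real pipeline goes further than picking one frame, aligning and merging regions from several, but the ranking step gives the flavor of the analysis.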
Phones have long included image signal processors (ISP) for improving the quality of photos digitally and in real time, but Apple accelerated the process in 2018 by making the ISP in the iPhone work closely with the Neural Engine, the company's recently added machine learning-focused processor.

I asked Giannandrea to name some of the ways that Apple uses machine learning in its recent software and products. He gave a laundry list of examples:
There's a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we've released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we're not using machine learning.
It's hard to find a part of the experience where you're not doing some predictive [work]. Like, app predictions, or keyboard predictions, or modern smartphone cameras do a ton of machine learning behind the scenes to figure out what they call 'saliency,' which is like, what's the most important part of the picture? Or, if you imagine doing blurring of the background, you're doing portrait mode.
All of these things benefit from the core machine learning features that are built into the core Apple platform. So, it's almost like, 'Find me something where we're not using machine learning.'
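Apple's actual saliency models are learned deep networks, but the compositing step Giannandrea describes for portrait mode (keep the subject sharp, blur the rest) reduces to masking. A minimal sketch, assuming a precomputed binary saliency mask rather than a model's output:

```python
import numpy as np

def box_blur(img, k=5):
    # Cheap separable blur: average each pixel over a k-wide window on both axes.
    kernel = np.ones(k) / k
    out = img.astype(float)
    for axis in (0, 1):
        out = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode='same'), axis, out)
    return out

def portrait_blur(img, saliency_mask):
    # Composite: salient (subject) pixels pass through untouched,
    # everything else comes from the blurred copy.
    mask = saliency_mask.astype(float)
    return mask * img + (1.0 - mask) * box_blur(img)

# Toy example: a bright square "subject" in the center of the frame,
# with a saliency mask covering exactly that square.
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
mask = np.zeros((32, 32))
mask[12:20, 12:20] = 1.0

result = portrait_blur(img, mask)
```

In a shipping camera the mask would come from a saliency or person-segmentation model and be soft-edged rather than binary, but the final blend is this same weighted composite.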
Borchers also pointed out accessibility features as important examples. 'They are fundamentally made available and possible because of this,' he said. 'Things like the sound detection capability, which is game-changing for that particular community, is possible because of the investments over time and the capabilities that are built in.'
Further, you may have noticed Apple's software and hardware updates over the past couple of years have emphasized augmented reality features. Most of those features are made possible thanks to machine learning. Per Giannandrea:
Machine learning is used a lot in augmented reality. The hard problem there is what's called SLAM, so Simultaneous Localization And Mapping. So, trying to understand if you have an iPad with a lidar scanner on it and you're moving around, what does it see? And building up a 3D model of what it's actually seeing.
That today uses deep learning and you need to be able to do it on-device because you want to be able to do it in real time. It wouldn't make sense if you're waving your iPad around and then perhaps having to do that at the data center. So in general I would say the way I think about this is that deep learning in particular is giving us the ability to go from raw data to semantics about that data.
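The first stage of that pipeline, turning raw lidar depth readings into 3D points the device can reason about, can be sketched with standard pinhole-camera unprojection. This is generic geometry rather than Apple's ARKit code, and the camera intrinsics below are made-up values:

```python
import numpy as np

def unproject_depth(depth, fx, fy, cx, cy):
    # Lift a depth map (meters per pixel) into camera-space 3D points:
    #   X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth
    # where (fx, fy) are focal lengths and (cx, cy) the principal point.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)  # shape (h, w, 3)

# Toy scan: a flat wall 2 meters away, with assumed intrinsics.
depth = np.full((4, 4), 2.0)
points = unproject_depth(depth, fx=1.0, fy=1.0, cx=2.0, cy=2.0)
```

A SLAM system then aligns successive point clouds as the device moves to estimate its own pose and accumulate the 3D model; the deep-learning part layers semantics ("this cluster of points is a table") on top of that geometry.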
Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company's custom-designed GPUs (graphics processing units). Giannandrea and Borchers argued that this approach is what distinguishes Apple's strategy from its competitors'.