What does Apple believe in?

Question: Vision and Mission of Apple Inc.

Hi! As part of my research, I have to study the vision and mission statement of Apple. I cannot find accurate information either on the web or on the corporate website. Can someone help me? Someone working inside Apple could help me the most.

Posted on Apr 23, 2020 3:45 PM

Helpful answers

Tim Cook, then Chief Operating Officer, during a conference call with investors in 2009:

“We believe that we are on the face of the earth to make great products and that’s not changing. We are constantly focusing on innovating. We believe in the simple not the complex. We believe that we need to own and control the primary technologies behind the products that we make, and participate only in markets where we can make a significant contribution. We believe in saying no to thousands of projects, so that we can really focus on the few that are truly important and meaningful to us. We believe in deep collaboration and cross-pollination of our groups, which allow us to innovate in a way that others cannot. And frankly, we don’t settle for anything less than excellence in every group in the company, and we have the self-honesty to admit when we’re wrong and the courage to change. And I think regardless of who is in what job those values are so embedded in this company that Apple will do extremely well.”

“Apple strives to bring the best personal computing experience to students, educators, creative professionals, and consumers around the world through its innovative hardware, software, and internet offerings.”

“There are lots of ways to be as a person. And some people express their deep appreciation in different ways. But one of the ways that I believe people express their appreciation to the rest of humanity is to make something wonderful, and put it out there. And you never meet the people, you never shake their hands, you never hear their story or tell yours. But somehow in the act of making something with a great deal of care and love, something’s transmitted there. And it’s a way of expressing to the rest of our species, our deep appreciation. So we need to be true to who we are. And remember what’s really important to us. That’s what’s going to keep Apple, Apple — is if we keep us, us.”

Source

Here’s why Apple believes it’s an AI leader—and why it says critics have it all wrong

Apple AI chief and ex-Googler John Giannandrea dives into the details with Ars.

Samuel Axon — Aug 6, 2020 11:45 am UTC

Machine learning (ML) and artificial intelligence (AI) now permeate nearly every feature on the iPhone, but Apple hasn’t been touting these technologies like some of its competitors have. I wanted to understand more about Apple’s approach, so I spent an hour talking with two Apple executives about the company’s strategy—and the privacy implications of all the new features based on AI and ML.

Despite this, Apple has included dedicated hardware for machine learning tasks in most of the devices it ships. Machine intelligence-driven functionality increasingly dominates the keynotes where Apple executives take the stage to introduce new features for iPhones, iPads, or the Apple Watch. The introduction of Macs with Apple silicon later this year will bring many of the same machine intelligence developments to the company’s laptops and desktops, too.

In the wake of the Apple silicon announcement, I spoke at length with John Giannandrea, Apple’s Senior Vice President for Machine Learning and AI Strategy, as well as with Bob Borchers, VP of Product Marketing. They described Apple’s AI philosophy, explained how machine learning drives certain features, and argued passionately for Apple’s on-device AI/ML strategy.

What is Apple’s AI strategy?

Both Giannandrea and Borchers joined Apple in the past couple of years; each previously worked at Google. Borchers actually rejoined Apple after time away; he was a senior director of marketing for the iPhone until 2009. And Giannandrea’s defection from Google to Apple in 2018 was widely reported; he had been Google’s head of AI and search.

“When I joined Apple, I was already an iPad user, and I loved the Pencil,” Giannandrea (who goes by “J.G.” to colleagues) told me. “So, I would track down the software teams and I would say, ‘Okay, where’s the machine learning team that’s working on handwriting?’ And I couldn’t find it.” It turned out the team he was looking for didn’t exist—a surprise, he said, given that machine learning is one of the best tools available for the feature today.

“I knew that there was so much machine learning that Apple should do that it was surprising that not everything was actually being done. And that has changed dramatically in the last two to three years,” he said. “I really honestly think there’s not a corner of iOS or Apple experiences that will not be transformed by machine learning over the coming few years.”

I asked Giannandrea why he felt Apple was the right place for him. His answer doubled as a succinct summary of the company’s AI strategy:

I think that Apple has always stood for that intersection of creativity and technology. And I think that when you’re thinking about building smart experiences, having vertical integration, all the way down from the applications, to the frameworks, to the silicon, is really essential. I think it’s a journey, and I think that this is the future of the computing devices that we have, is that they be smart, and that, that smart sort of disappear.

Speaking again of the handwriting example, Giannandrea made the case that Apple is best positioned to “lead the industry” in building machine intelligence-driven features and products:

We made the Pencil, we made the iPad, we made the software for both. It’s just unique opportunities to do a really, really good job. What are we doing a really, really good job at? Letting somebody take notes and be productive with their creative thoughts on digital paper. What I’m interested in is seeing these experiences be used at scale in the world.

He contrasted this with Google. “Google is an amazing company, and there’s some really great technologists working there,” he said. “But fundamentally, their business model is different and they’re not known for shipping consumer experiences that are used by hundreds of millions of people.”

How does Apple use machine learning today?

Apple has made a habit of crediting machine learning with improving some features in the iPhone, Apple Watch, or iPad in its recent marketing presentations, but it rarely goes into much detail—and most people who buy an iPhone never watched those presentations, anyway. Contrast this with Google, for example, which places AI at the center of much of its messaging to consumers.

While computers can process certain data more quickly or accurately than humans can, they are still ultimately not intelligent. Traditional models of computer programming involve telling the computer what to do at all times, and in advance; if precisely this happens, then do exactly this. But what if something else happens—even a minor variation? Well, programmers can get quite creative and elaborate to define sophisticated behaviors, but the machine is incapable of making judgments of its own.

With machine learning, in addition to telling a computer what to do, programmers give it a data set relevant to the task and a methodology for analyzing that data set. They then give it time to spin its cycles getting more accurate at labeling or interpreting that data over time, based on positive or negative feedback. This allows the machine to algorithmically make informed guesses about data it hasn’t previously encountered, if the new data is similar to that with which it was trained.
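
To make that contrast concrete, here is a minimal, hypothetical Python sketch: a hand-written rule next to a rule whose threshold is derived from labeled examples. The “contact area” feature, the numbers, and the labels are all invented for illustration and have nothing to do with Apple’s actual models.

```python
# Contrast between a hand-written rule and a learned rule.
# All data is synthetic; the "contact area" feature is invented for illustration.

def rule_based(contact_area: float) -> str:
    # Traditional programming: the programmer fixes the threshold in advance.
    return "palm" if contact_area > 4.0 else "fingertip"

def train_threshold(samples: list[tuple[float, str]]) -> float:
    # "Training": pick the threshold that misclassifies the fewest labeled examples.
    candidates = sorted(area for area, _ in samples)
    best_t, best_errors = candidates[0], len(samples) + 1
    for t in candidates:
        errors = sum(
            1 for area, label in samples
            if ("palm" if area > t else "fingertip") != label
        )
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t

# Labeled examples (feature, label) stand in for the "data set" mentioned above.
training_data = [(0.8, "fingertip"), (1.1, "fingertip"), (5.2, "palm"),
                 (6.0, "palm"), (1.4, "fingertip"), (4.8, "palm")]

learned_t = train_threshold(training_data)
print(rule_based(3.9))                             # the fixed rule's answer
print("palm" if 3.9 > learned_t else "fingertip")  # the learned rule's answer for an unseen input
```

Real systems learn far richer functions than a single threshold, but the shape of the process is the same: a data set plus a learning procedure, then generalization to inputs the system has never seen before.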

When big tech companies talk about artificial intelligence today, they often mean machine learning. Machine learning is a subset of AI. Many lauded gadget features—like image recognition—are driven by a subset of machine learning called “deep” learning.

There are numerous examples of machine learning being used in Apple’s software and devices, most of them new in just the past couple of years.

Machine learning is used to help the iPad’s software distinguish between a user accidentally pressing their palm against the screen while drawing with the Apple Pencil, and an intentional press meant to provide an input. It’s used to monitor users’ usage habits to optimize device battery life and charging, both to improve the time users can spend between charges and to protect the battery’s long-term viability. It’s used to make app recommendations.
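
As a rough illustration of the battery idea only, and not of Apple’s implementation, a device could estimate the user’s typical unplug time from recent history and postpone the final stretch of charging until shortly before it. The history data and the should_resume_full_charge helper below are hypothetical.

```python
from datetime import datetime, time
from statistics import median

# Hypothetical sketch: hold the charge level below a ceiling (say 80%) and
# finish charging only shortly before the user's typical unplug time,
# estimated from their recent behaviour. Not Apple's implementation.

def typical_unplug_minute(history: list[time]) -> int:
    """Median minute-of-day at which the user unplugged on recent mornings."""
    return int(median(t.hour * 60 + t.minute for t in history))

def should_resume_full_charge(now: datetime, history: list[time],
                              hours_needed: float = 1.5) -> bool:
    """Resume charging past the ceiling only when the predicted unplug time is near."""
    minutes_left = typical_unplug_minute(history) - (now.hour * 60 + now.minute)
    return minutes_left <= hours_needed * 60

history = [time(7, 30), time(7, 45), time(7, 20), time(8, 0)]
print(should_resume_full_charge(datetime(2020, 8, 6, 3, 0), history))   # False: keep holding
print(should_resume_full_charge(datetime(2020, 8, 6, 6, 30), history))  # True: finish charging
```

A real feature would also have to cope with weekends, travel, and alarms, which is where learned models earn their keep over a fixed median.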

Savvy iPhone owners might also notice that machine learning is behind the Photos app’s ability to automatically sort pictures into pre-made galleries, or to accurately give you photos of a friend named Jane when her name is entered into the app’s search field.

In other cases, few users may realize that machine learning is at work. For example, your iPhone may take multiple pictures in rapid succession each time you tap the shutter button. An ML-trained algorithm then analyzes each image and can composite what it deems the best parts of each image into one result.
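
A toy version of one small piece of that pipeline might score every burst frame with a simple sharpness proxy and keep the best one. Real camera pipelines merge regions from several frames using learned models; the sketch below, which runs on synthetic frames, only illustrates the “pick the best parts” intuition.

```python
import numpy as np

# Toy sketch: rank burst frames by a sharpness proxy (variance of a
# Laplacian-filtered image) and keep the sharpest. Real pipelines composite
# regions from several frames with learned models; this just picks one frame.

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpness(gray: np.ndarray) -> float:
    """Variance of the Laplacian response over the interior of a grayscale image."""
    h, w = gray.shape
    response = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            response += LAPLACIAN[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return float(response.var())

def pick_best(frames: list[np.ndarray]) -> np.ndarray:
    return max(frames, key=sharpness)

rng = np.random.default_rng(0)
detailed = rng.random((64, 64))   # lots of high-frequency detail
flat = np.full((64, 64), 0.5)     # a flat, blurry-looking frame
print(pick_best([flat, detailed, flat]) is detailed)  # True
```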

Phones have long included image signal processors (ISPs) for improving the quality of photos digitally and in real time, but Apple accelerated the process in 2018 by making the ISP in the iPhone work closely with the Neural Engine, the company’s recently added machine learning-focused processor.

I asked Giannandrea to name some of the ways that Apple uses machine learning in its recent software and products. He gave a laundry list of examples:

There’s a whole bunch of new experiences that are powered by machine learning. And these are things like language translation, or on-device dictation, or our new features around health, like sleep and hand washing, and stuff we’ve released in the past around heart health and things like this. I think there are increasingly fewer and fewer places in iOS where we’re not using machine learning.

It’s hard to find a part of the experience where you’re not doing some predictive [work]. Like, app predictions, or keyboard predictions, or modern smartphone cameras do a ton of machine learning behind the scenes to figure out what they call “saliency,” which is like, what’s the most important part of the picture? Or, if you imagine doing blurring of the background, you’re doing portrait mode.

All of these things benefit from the core machine learning features that are built into the core Apple platform. So, it’s almost like, “Find me something where we’re not using machine learning.”

Borchers also pointed out accessibility features as important examples. “They are fundamentally made available and possible because of this,” he said. “Things like the sound detection capability, which is game-changing for that particular community, is possible because of the investments over time and the capabilities that are built in.”

Further, you may have noticed Apple’s software and hardware updates over the past couple of years have emphasized augmented reality features. Most of those features are made possible thanks to machine learning. Per Giannandrea:

Machine learning is used a lot in augmented reality. The hard problem there is what’s called SLAM, so Simultaneous Localization And Mapping. So, trying to understand if you have an iPad with a lidar scanner on it and you’re moving around, what does it see? And building up a 3D model of what it’s actually seeing.

That today uses deep learning and you need to be able to do it on-device because you want to be able to do it in real time. It wouldn’t make sense if you’re waving your iPad around and then perhaps having to do that at the data center. So in general I would say the way I think about this is that deep learning in particular is giving us the ability to go from raw data to semantics about that data.
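
Giannandrea’s “raw data to semantics” framing can be illustrated with a deliberately simplified, non-learned example: turning raw 3D points into a statement such as “there is a roughly flat, horizontal surface at this height.” The least-squares plane fit below is an illustrative stand-in, not how Apple’s SLAM or scene-understanding pipeline works.

```python
import numpy as np

# Illustrative stand-in for "raw data to semantics": fit a plane z = ax + by + c
# to raw 3D points, turning a point cloud into a statement about a flat surface.
# Not how Apple's SLAM or scene-understanding pipeline works.

def fit_plane(points: np.ndarray) -> tuple[np.ndarray, float]:
    """points: (N, 3) array of x, y, z samples. Returns (a, b, c) and the RMS error."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    rms = float(np.sqrt(np.mean((A @ coeffs - points[:, 2]) ** 2)))
    return coeffs, rms

rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, size=(500, 2))
z = 0.02 * xy[:, 0] - 0.01 * xy[:, 1] + 0.8 + rng.normal(0.0, 0.005, 500)  # a noisy "floor"
coeffs, rms = fit_plane(np.column_stack([xy, z]))
print(coeffs, rms)  # near-horizontal plane at height ~0.8 with a small residual: "a floor"
```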

Increasingly, Apple performs machine learning tasks locally on the device, on hardware like the Apple Neural Engine (ANE) or on the company’s custom-designed GPUs (graphics processing units). Giannandrea and Borchers argued that this approach is what makes Apple’s strategy distinct amongst competitors.

Source

How does Apple technology hold up against NSO spyware?

Researchers have warned that despite its reputation for secure products, Apple’s closed culture and fear of negative press harm its ability to provide security. Composite: Alex Plavevski/EPA

The iPhone maker says it is keeping pace with malware, but the Pegasus project paints a worrying picture

Last modified on Mon 19 Jul 2021 20.43 BST

It is one of the technological battles of the 21st century – in which every mobile phone user has a stake.

In one corner, Apple, which has more than a billion active iPhones being used across the world. In the other, companies such as Israel’s NSO Group, developing spyware designed to defeat the most sophisticated security and privacy measures.

And while Apple says it is keeping pace with surveillance tools that are used to attack its phones – it boasts of creating “the most secure consumer platform in the world” – research undertaken as part of the Pegasus project paints a more worrying picture.

The malware, it appears, has been one step ahead.

That, at least, is the conclusion of new technical research by Amnesty International, which suggests that even the most up-to-date iPhones running the latest operating system have still been penetrated by NSO Group’s Pegasus spyware.

What is in the Pegasus project data?

What is in the data leak?

The data leak is a list of more than 50,000 phone numbers that, since 2016, are believed to have been selected as those of people of interest by government clients of NSO Group, which sells surveillance software. The data also contains the time and date that numbers were selected, or entered on to a system. Forbidden Stories, a Paris-based nonprofit journalism organisation, and Amnesty International initially had access to the list and shared access with 16 media organisations including the Guardian. More than 80 journalists have worked together over several months as part of the Pegasus project. Amnesty’s Security Lab, a technical partner on the project, did the forensic analyses.

What does the leak indicate?

The consortium believes the data indicates the potential targets NSO’s government clients identified in advance of possible surveillance. While the data is an indication of intent, the presence of a number in the data does not reveal whether there was an attempt to infect the phone with spyware such as Pegasus, the company’s signature surveillance tool, or whether any attempt succeeded. The presence in the data of a very small number of landlines and US numbers, which NSO says are “technically impossible” to access with its tools, reveals some targets were selected by NSO clients even though they could not be infected with Pegasus. However, forensic examinations of a small sample of mobile phones with numbers on the list found tight correlations between the time and date of a number in the data and the start of Pegasus activity – in some cases as little as a few seconds.
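
The kind of correlation described, in which a number’s selection timestamp is followed seconds or minutes later by the first Pegasus trace on the matching handset, is straightforward to check once both timestamps are available. The sketch below uses invented numbers and timestamps purely as an illustration; it is not Amnesty’s forensic tooling.

```python
from datetime import datetime

# Illustration only: compare when a number was selected in the leaked data with
# the first forensic trace of Pegasus activity on the matching handset.
# All numbers and timestamps are invented; this is not Amnesty's tooling.

selections = {
    "+00 0000 0001": datetime(2021, 4, 10, 14, 3, 12),
    "+00 0000 0002": datetime(2021, 5, 2, 9, 41, 0),
}
first_pegasus_trace = {
    "+00 0000 0001": datetime(2021, 4, 10, 14, 3, 41),  # 29 seconds after selection
    "+00 0000 0002": datetime(2021, 5, 4, 18, 0, 0),    # more than two days after selection
}

for number, selected_at in selections.items():
    trace = first_pegasus_trace.get(number)
    if trace is None:
        continue  # no forensic data for this handset
    delta = (trace - selected_at).total_seconds()
    tight = 0 <= delta <= 300  # treat anything within five minutes as a tight correlation
    print(number, f"{delta:.0f}s", "tight correlation" if tight else "no tight correlation")
```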

What did forensic analysis reveal?

Amnesty examined 67 smartphones where attacks were suspected. Of those, 23 were successfully infected and 14 showed signs of attempted penetration. For the remaining 30, the tests were inconclusive, in several cases because the handsets had been replaced. Fifteen of the phones were Android devices, none of which showed evidence of successful infection. However, unlike iPhones, phones that use Android do not log the kinds of information required for Amnesty’s detective work. Three Android phones showed signs of targeting, such as Pegasus-linked SMS messages.

Amnesty shared “backup copies” of four iPhones with Citizen Lab, a research group at the University of Toronto that specialises in studying Pegasus, which confirmed that they showed signs of Pegasus infection. Citizen Lab also conducted a peer review of Amnesty’s forensic methods, and found them to be sound.

Which NSO clients were selecting numbers?

While the data is organised into clusters, indicative of individual NSO clients, it does not say which NSO client was responsible for selecting any given number. NSO claims to sell its tools to 60 clients in 40 countries, but refuses to identify them. By closely examining the pattern of targeting by individual clients in the leaked data, media partners were able to identify 10 governments believed to be responsible for selecting the targets: Azerbaijan, Bahrain, Kazakhstan, Mexico, Morocco, Rwanda, Saudi Arabia, Hungary, India, and the United Arab Emirates. Citizen Lab has also found evidence of all 10 being clients of NSO.

What does NSO Group say?

You can read NSO Group’s full statement here. The company has always said it does not have access to the data of its customers’ targets. Through its lawyers, NSO said the consortium had made “incorrect assumptions” about which clients use the company’s technology. It said the 50,000 number was “exaggerated” and that the list could not be a list of numbers “targeted by governments using Pegasus”. The lawyers said NSO had reason to believe the list accessed by the consortium “is not a list of numbers targeted by governments using Pegasus, but instead, may be part of a larger list of numbers that might have been used by NSO Group customers for other purposes”. They said it was a list of numbers that anyone could search on an open source system. After further questions, the lawyers said the consortium was basing its findings “on misleading interpretation of leaked data from accessible and overt basic information, such as HLR Lookup services, which have no bearing on the list of the customers’ targets of Pegasus or any other NSO products … we still do not see any correlation of these lists to anything related to use of NSO Group technologies”. Following publication, they explained that they considered a “target” to be a phone that was the subject of a successful or attempted (but failed) infection by Pegasus, and reiterated that the list of 50,000 phones was too large for it to represent “targets” of Pegasus. They said that the fact that a number appeared on the list was in no way indicative of whether it had been selected for surveillance using Pegasus.

What is HLR lookup data?

The term HLR, or home location register, refers to a database that is essential to operating mobile phone networks. Such registers keep records on the networks of phone users and their general locations, along with other identifying information that is used routinely in routing calls and texts. Telecoms and surveillance experts say HLR data can sometimes be used in the early phase of a surveillance attempt, when identifying whether it is possible to connect to a phone. The consortium understands NSO clients have the capability through an interface on the Pegasus system to conduct HLR lookup inquiries. It is unclear whether Pegasus operators are required to conduct HLR lookup inquiries via its interface to use its software; an NSO source stressed its clients may have different reasons – unrelated to Pegasus – for conducting HLR lookups via an NSO system.

This has led to some people’s mobiles being turned into portable surveillance devices, giving complete access to numbers, text messages, photos. Everything.

The disclosure points to a problem security researchers have been warning about for years: that despite its reputation for building what is seen by millions of customers as a secure product, some believe Apple’s closed culture and fear of negative press have harmed its ability to provide security for those targeted by governments and criminals.

“Apple’s self-assured hubris is just unparalleled,” said Patrick Wardle, a former NSA employee and founder of the Mac security developer Objective-See. “They basically believe that their way is the best way. And to be fair … the iPhone has had incredible success.

“But you talk to any external security researcher, they’re probably not going to have a lot of great things to say about Apple. Whereas if you talk to security researchers in dealing with, say, Microsoft, they’ve said: ‘We’re gonna put our ego aside, and ultimately realise that the security researchers are reporting vulnerabilities that at the end of the day are benefiting our users, because we’re able to patch them.’ I don’t think Apple has that same mindset.”

The concern about the vulnerability of mobile devices is one aspect highlighted by the Pegasus project, a collaborative journalism investigation coordinated by Forbidden Stories.

Pegasus: the spyware technology that threatens democracy – video

With the technical support of Amnesty International, the project has investigated a leaked list of tens of thousands of mobile phone numbers – linked to both Apple and Android handsets.

While it was only possible to test a fraction of the phones that were listed for potential surveillance, the scale of what appears to have been a pool of possible targets suggests that customers of the world’s most sophisticated spyware company have not been deterred by security advances made by companies such as Apple.

Most experts agree that the iPhone’s greatest vulnerability is also one of its most popular features: iMessage, which Apple announced earlier this year it had sought to bolster. One method the company has used is to create a feature called BlastDoor, which screens suspect messages before they delve too deeply into a phone.
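
The general principle behind a screening layer of this kind, as it has been described publicly, is to parse untrusted message content in an isolated, expendable process so that a malformed payload cannot crash or fully compromise the main app. The Python sketch below illustrates that isolation idea generically; it says nothing about how BlastDoor is actually implemented.

```python
import json
from multiprocessing import Process, Queue

# Generic sketch of sandboxed parsing: handle untrusted input in a separate,
# expendable worker process with a timeout, so a crash or hang in the parser
# does not bring down the main application. Illustrates the isolation idea
# only; this is not how Apple's BlastDoor works.

def parse_untrusted(payload: bytes, out: Queue) -> None:
    try:
        out.put(("ok", json.loads(payload)))  # stand-in for a complex message parser
    except Exception as exc:
        out.put(("rejected", str(exc)))

def screen_message(payload: bytes, timeout: float = 2.0):
    out: Queue = Queue()
    worker = Process(target=parse_untrusted, args=(payload, out))
    worker.start()
    worker.join(timeout)
    if worker.is_alive():  # the parser hung: kill it and drop the message
        worker.terminate()
        worker.join()
        return ("rejected", "parser timed out")
    return out.get() if not out.empty() else ("rejected", "parser crashed")

if __name__ == "__main__":
    print(screen_message(b'{"text": "hello"}'))         # well-formed content gets through
    print(screen_message(b"\xff\xfe not json at all"))  # malformed content is rejected
```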

But even those advances have not kept iPhone users safe.

“We have seen Pegasus deployed through iMessage against Apple’s latest version of iOS, so it’s pretty clear that NSO can beat BlastDoor,” said Bill Marczak, a fellow at Citizen Lab, a cybersecurity analysts’ unit based at the University of Toronto. “Of course, developing security features is still important. Each new measure raises the cost to hack devices, which can price out less sophisticated attackers.”

According to Wardle, the security features that Apple boasts about are a double-edged sword. “iMessage is end-to-end encrypted, which means that nobody is going to see you throwing that exploit. From the attacker’s point of view, that’s lovely,” he said.

A similar problem exists on the device itself: unlike on a Mac or an Android phone, security researchers are denied the ability to see what an iPhone is actually doing.

“Once an attacker is inside, they can almost leverage the device’s security against the user,” Wardle said. “So, for example, I have no idea if my iPhone is hacked. My Mac computer on the other hand, I would say, yes, it’s an easier target, but I can look at a list of running processes, I have a firewall product that I can ask what is allowed to talk to the internet.”

That opacity may even undercut Apple’s claim that attacks “often have a short shelf life”. Because researchers find it very difficult to examine the inner workings of an iPhone, “unless the attacker is very unlucky, that implant is going to remain on the device, likely undetected”, Wardle said.

Claudio Guarnieri, the head of Amnesty’s Security Lab, said there was “no doubt” that NSO spyware could infect the most recent version of iOS. While Apple had done a lot of work to improve security, he said, it was natural the company would always fall behind thousands of attackers who were “always a step ahead”.

“There’s always going to be someone who is very talented out there, motivated by the high remuneration they get from finding these [security] issues, working in all possible ways to bypass and find workarounds to these mitigations,” Guarnieri said.

Another Citizen Lab researcher, John Scott-Railton, said it was important for companies such as Apple to defend against threats by “constantly tracking them” and anticipating what might come next. “If you don’t do that, you can’t really build a secure product, because as much as you talk about what potential threats exist against your platform, lots of clever people will find threats that you don’t know [about],” he said.

Even as Apple’s peers in the tech industry have begun to cry foul on advances by companies such as NSO, and have claimed they pose a grave threat to cybersecurity, Apple has largely stayed out of the fray. In a recent court submission filed in support of WhatsApp, the messaging app that is suing NSO Group in California, companies from Microsoft to Cisco created a coalition and filed a statement saying NSO made ordinary people less safe. Apple did not join the submission.

The partners in the Pegasus project put a series of questions to Apple.

In a statement, the iPhone maker said: “Apple unequivocally condemns cyber-attacks against journalists, human rights activists, and others seeking to make the world a better place. For over a decade, Apple has led the industry in security innovation and, as a result, security researchers agree iPhone is the safest, most secure consumer mobile device on the market.”

Apple also said that security was a dynamic field and that its BlastDoor was not the end of its efforts to secure iMessage.

“Attacks like the ones described are highly sophisticated, cost millions of dollars to develop, often have a short shelf life, and are used to target specific individuals,” it said. “While that means they are not a threat to the overwhelming majority of our users, we continue to work tirelessly to defend all our customers, and we are constantly adding new protections for their devices and data.”

The Washington Post reporter Craig Timberg contributed to this report.

Source
