What Was the First iPhone Called?

Who Invented the iPhone?

Learn How Apple’s First Smartphone Came to Be

According to the Oxford English Dictionary, a smartphone is “a mobile phone that performs many of the functions of a computer, typically having a touchscreen interface, internet access, and an operating system capable of running downloaded apps.” As those of you who know your smartphone history are aware, Apple did not invent the smartphone. They did, however, bring us the iconic and much-imitated iPhone, which debuted on June 29, 2007.

Precursors to the iPhone

Prior to the iPhone, smartphones were often bulky, unreliable, and prohibitively expensive. The iPhone was a game-changer. Its technology was state-of-the-art at the time, and because more than 200 patents went into its original manufacture, there’s no pinpointing a single person as the iPhone’s inventor. Still, a few names—including Apple designers John Casey and Jonathan Ive—stand out as being instrumental in bringing Steve Jobs’ vision for a touchscreen smartphone to life.

While Apple had produced the Newton MessagePad, a personal digital assistant (PDA) device, from 1993 to 1998, the first concept for a true iPhone-type device came about in 2000, when Apple designer John Casey circulated concept art via internal email for something he called the Telipod—a telephone and iPod combination. The Telipod never made it into production, but Apple co-founder and CEO Steve Jobs did believe that cell phones with a touchscreen and access to the Internet were the future of accessible information. Accordingly, Jobs set a team of engineers to tackle the project.

Apple’s First Smartphone

Apple’s first venture into mobile phones, the Motorola ROKR E1, was released on Sept. 7, 2005. It was the first mobile phone to use iTunes, the music software Apple had debuted in 2001. However, the ROKR was an Apple and Motorola collaboration, and Apple was not happy with Motorola’s contributions. Within a year, Apple discontinued support for the ROKR. On Jan. 9, 2007, Steve Jobs announced the new iPhone at the Macworld Convention. It went on sale on June 29, 2007.

What Made the iPhone So Special

Jonathan Ive, who joined Apple in 1992 and served as its chief design officer from 2015 to 2019, was largely responsible for the look and feel of the iPhone. Born in Britain in February 1967, Ive was also the principal designer of the iMac, the titanium and aluminum PowerBook G4, the MacBook, the unibody MacBook Pro, the iPod, the iPhone, and the iPad.

Among the first smartphones to ship without a dedicated keypad for dialing, the iPhone was entirely a touchscreen device that broke new technological ground with its multitouch controls. In addition to using the screen to select and launch apps, users could scroll with the swipe of a finger and zoom with a pinch.
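To make the multitouch idea concrete, here is a minimal sketch of how an iOS app gets finger-swipe scrolling and pinch-to-zoom today. It uses Apple’s current UIKit APIs (UIScrollView and its delegate), not the original 2007 internals, and the image asset name is a placeholder.

```swift
import UIKit

// Minimal sketch: swipe-to-scroll and pinch-to-zoom over a single image,
// using the modern UIKit scroll view rather than the first iPhone's internals.
final class PhotoViewController: UIViewController, UIScrollViewDelegate {
    private let imageView = UIImageView(image: UIImage(named: "photo")) // placeholder asset

    override func viewDidLoad() {
        super.viewDidLoad()
        // A UIScrollView provides finger-swipe scrolling and pinch zooming for free.
        let scrollView = UIScrollView(frame: view.bounds)
        scrollView.minimumZoomScale = 1.0
        scrollView.maximumZoomScale = 4.0
        scrollView.delegate = self
        scrollView.contentSize = imageView.bounds.size
        scrollView.addSubview(imageView)
        view.addSubview(scrollView)
    }

    // Telling the scroll view which subview to scale is all pinch-to-zoom needs.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        imageView
    }
}
```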

The iPhone also included an accelerometer, a motion sensor that allowed the user to turn the phone sideways and have the display automatically rotate to match. While it was not the first device to offer apps or software add-ons, it was the first smartphone to make a real commercial success of an app marketplace.
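For the accelerometer, the sketch below shows how an app can read the raw sensor and make a rough portrait-versus-landscape call, the same kind of signal behind automatic rotation. It uses Apple’s Core Motion framework, which post-dates the first iPhone’s software, and the threshold logic is illustrative rather than what Apple actually ships.

```swift
import CoreMotion

// Minimal sketch: infer a rough portrait/landscape orientation from raw
// accelerometer readings, the kind of signal that drives auto-rotation.
let motionManager = CMMotionManager()

func startOrientationWatch() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 0.2 // seconds between samples
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Gravity dominates whichever axis points toward the ground:
        // a large |x| means the phone is on its side (landscape),
        // a large |y| means it is upright (portrait).
        let orientation = abs(a.x) > abs(a.y) ? "landscape" : "portrait"
        print("Device is roughly \(orientation)")
    }
}
```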

The iPhone 4S, released in 2011, added a personal assistant called Siri: a voice-controlled, artificial intelligence-based assistant that could not only perform numerous tasks for the user but also learn and adapt to serve that user better. With the addition of Siri, the iPhone was no longer a mere phone or music player—it put an entire world of information at the user’s fingertips.

Waves of the Future

Apple has continued to improve and update the iPhone since its debut. The iPhone X (pronounced “ten”), released in November 2017, was the first iPhone to use organic light-emitting diode (OLED) screen technology and facial recognition to unlock the phone, and, together with the iPhone 8 unveiled at the same event, it brought wireless charging to the line.

In 2018, Apple released three follow-ups to the iPhone X: the iPhone XS, the iPhone XS Max (a larger version of the XS), and the budget-friendly iPhone XR, all with improved camera technology that enables what Apple terms “Smart HDR” (high dynamic range) photography. Going forward, Apple is expected to continue with OLED displays for its 2019 devices, and there are rumors that the company plans to retire its earlier LCD (liquid crystal display) screens altogether.

Who Invented the iPhone?

It all depends on what you mean by “invented”

The great man theory has crept back into popular culture in recent years, repurposed for the world of entrepreneurs, tech start-ups and digital conglomerates. Elon Musk revolutionized the electric car. Mark Zuckerberg pioneered the social network. Steve Jobs and his team at Apple invented the iPhone.

These heroic narratives are both factually incorrect and unhelpful. In educational terms, a whole generation is growing up on inspirational YouTube videos revering individualism and some troubling leadership traits (see here for the darker side of Jobs and Apple). Yet the challenges the world faces—energy crises, food shortages, climate change, overpopulation—require collaboration and cooperation from all of us, both as global citizens and nations. These challenges are too complex, interconnected and fast-moving to be solved by any one person, idea, organization or nation. We will need to harness the fundamental principle underpinning all research—to stand on the shoulders of giants, with each new breakthrough building on the work of others before it. The hidden story of the iPhone is a testament to this.

The relentless drive and ingenuity of the many teams at Apple cannot be doubted. But there were hundreds of research breakthroughs and innovations without which the iPhone would not even be possible. Each was the result of countless researchers, universities, funders, governments and private companies layering one innovation on top of another.

To demonstrate this, here’s a closer look at just three of the research breakthroughs that underpin the iPhone.

THE TOUCH SCREEN

The iPhone wouldn’t be the iPhone without its iconic touch-screen technology.

The first touch screen was actually invented way back in the 1960s by Eric Arthur Johnson, a radar engineer working at a government research center in the U.K. While the Righteous Brothers were losing that lovin’ feeling, Johnson was publishing his findings in an Electronics Letters article published by the Institution of Engineering and Technology. His 1965 article, “Touch display—a novel input/output device for computers,” continues to be cited by researchers to this day. The 1969 patent that followed has since been cited by a whole host of famous inventions—including Apple’s 1997 patent for “a portable computer handheld cellular telephone.”

Since Johnson’s first leap forward, billions of dollars have been awarded to research on touch-screen technology—from public bodies and private investors alike, with one often leading to the other. The University of Cambridge, for example, recently spun out a limited company to secure further investment for their own research on touch-screen technology, successfully closing a $5.5m investment round backed by venture capitalists from the U.K. and China.

One Apple patent on touch-screen technology cites over 200 scientific peer-reviewed articles, published by a range of academic societies, commercial publishers and university presses. These authors did not work alone. Most were part of a research group. Many were awarded a grant for their research. Each had their article independently evaluated by at least one external academic in the peer-review process that sits at the core of academic research. Consider one article on touch-screen technology recently published by Elsevier’s Information Sciences journal. Six authors and two blind peer reviewers are acknowledged. Conservatively extrapolating such figures across the two hundred articles cited by Apple tallies to over a thousand researchers, each making their important contribution to this area of touch-screen technology.

Johnson may have taken the first step, and Apple harnessed its potential, but we owe touch-screen technology to the collective efforts of numerous researchers all over the world.

THE LITHIUM BATTERY

Battery Low. Blink, blink. We all know iPhones soak up a lot of power, yet they’d be nowhere without the rechargeable lithium battery.

British scientist Stanley Whittingham created the very first example of the lithium battery while working in a lab for ExxonMobil in the ‘70s, carrying forward research he’d initially conducted with colleagues at Stanford University. Previous research had already indicated that lithium could be used to store energy, but it was Whittingham and his team that figured out how to do this at room temperature—without the risk of explosion (Samsung take note).

A professor at the University of Oxford, John Goodenough, then improved on Whittingham’s original work by using metal oxides to enhance performance. This, in turn, piqued the interest of Sony, which became the first company to commercialize lithium batteries in the 1990s, launching a lithium-powered cell phone in Japan in 1991. All of this provided the basis for mass use, with Apple duly obliging when it first launched the iPhone to over a million users in 2007.

Lithium’s story doesn’t stop there. As one of the building blocks of a world without fossil fuels, its production is zealously guarded. So who do you think bought Sony’s battery business in 2016? Why, one of Apple’s leading suppliers no less, Murata Manufacturing. Meanwhile, John Goodenough, now 95, continues his groundbreaking research. Only a few months ago he published a landmark study in the Journal of the American Chemical Society. Among its claims? That Goodenough had created a lithium battery for electric cars that can be used 23 times more than the current average.

THE INTERNET AND THE WORLD WIDE WEB

When Apple engineer Andy Grignon first added internet functionality to an iPod in 2004, Steve Jobs was far from enthusiastic: “This is bullshit. I don’t want this. I know it works, I got it, great, thanks, but this is a shitty experience.”

The painstaking work of multiple Apple teams took a “shitty experience” and made something revolutionary—all collective human experience and knowledge right there in your back pocket, at your fingertips. But who do we have to thank for this?

Sir Tim Berners-Lee is widely credited with the invention of the World Wide Web. His work began in the 1980s while at the European Organization for Nuclear Research. Better known by its French acronym, CERN was formally established by 12 European governments in 1954 and continues to be funded by its member states. Berners-Lee’s ideas began as a proposed solution to a very specific problem at CERN: how best to facilitate the sharing and updating of the vast amounts of information and data used by CERN researchers. His proposal was based on the concept of hypertext, a term first coined by the information technology pioneer Ted Nelson in a 1965 paper published by the Association for Computing Machinery. Often compared to an electronic version of the footnoting system used by researchers the world over, hypertext underpins the web, enabling you to jump from one source of information to another. Anywhere on the Internet. In whatever form it may be.

But even Berners-Lee cannot be given solo credit. If the World Wide Web is the map, the internet is the landscape we navigate: a networking infrastructure connecting millions of computers globally, enabling each to communicate with the other, transferring vast quantities of information.

To trace the origins of the internet we have to return to 1965. While Nelson was coining hypertext and Johnson was inventing the touch screen, two researchers at MIT, Thomas Merrill and Lawrence Roberts, connected their computer to another one 3,000 miles away in California using a simple low-speed dial-up telephone line. Shortly after that came Arpanet, not a dystopian AI system, but the Advanced Research Projects Agency Network. Arpanet was established and funded by DARPA, the U.S. Defense Advanced Research Projects Agency, and was initially conceived as a means of interconnecting the American military’s computers across their various regional hubs.

It was Arpanet that really gave birth to the internet, in a moment described below by Leonard Kleinrock. It’s October 1969, three months after man has walked on the moon, and Kleinrock and his colleagues have just connected multiple computers across the U.S.:

We typed the L and we asked on the phone,

Do you see the L?

Yes, we see the L

We typed the O, and we asked, Do you see the O?

Yes, we see the O.

Then we typed the G, and the system crashed…

The course of true innovation never did run smoothly. But these early breakthroughs of the space age were the basis for all that was to follow. While the modern iPhone is now 120 million times more powerful than the computers that took Apollo 11 to the moon, its real power lies in its ability to leverage the billions of websites and terabytes that make up the internet.

A brief analysis of these three research breakthroughs reveals a research web of over 400,000 publications since Apple first published their phone patent in 1997. Factor in the supporting researchers, funders, universities and companies behind them, and the contributing network is simply awe-inspiring. And we’ve barely scratched the surface. There are countless other research breakthroughs without which the iPhone would not be possible. Some well known, others less so. Both GPS and Siri had their origins with the U.S. military, while the complex algorithms that enable digitization were initially conceived to detect nuclear testing. All had research at their core.

The iPhone is an era-defining technology. Era-defining technologies do not come from the rare brilliance of one person or organization, but from layer upon layer of innovation and decade upon decade of research, with thousands of individuals and organizations standing on one another’s shoulders and peering that bit further into the future. In our age of seemingly insurmountable global challenges, we must not only remember this but be inspired by it.

We must encourage openness and transparency at the heart of research, ensuring it is disseminated as widely, quickly and clearly as possible. We must remember that every delay and distortion matters. Research integrity and reproducibility, transparent peer review, open access, diversity—these are more than just buzzwords. They are exciting steps toward reforming the infrastructure of a global research ecosystem that has always been our best hope for the future.

The views expressed are those of the author(s) and are not necessarily those of Scientific American.

ABOUT THE AUTHOR(S)

Matthew Hayes is the author of “Robert Kennedy and the Cuban Missile Crisis,” published in the journal History. He is director of publisher and funder growth at Publons, the world’s largest peer review platform and a part of Clarivate Analytics. He studied history at Oxford University, has a master’s in international relations from SOAS, University of London, and is currently researching a PhD on global citizenship education at the Institute of Education, University College London.
