
Who Invented the iPhone?

Learn How Apple’s First Smartphone Came to Be

According to the “Oxford English Dictionary,” a smartphone is “a mobile phone that performs many of the functions of a computer, typically having a touchscreen interface, internet access, and an operating system capable of running downloaded apps.” As those of you who know your smartphone history are aware, Apple did not invent the smartphone. It did, however, bring us the iconic and much-imitated iPhone, which debuted on June 29, 2007.

Precursors to the iPhone

Prior to the iPhone, smartphones were often bulky, unreliable, and prohibitively expensive. The iPhone was a game-changer. Its technology was state-of-the-art at the time, and since more than 200 patents went into its original manufacture, there’s no pinpointing a single person as the iPhone’s inventor. Still, a few names—including Apple designers John Casey and Jonathan Ive—stand out as being instrumental in bringing Steve Jobs’ vision for a touchscreen smartphone to life.

While Apple had produced the Newton MessagePad, a personal digital assistant (PDA) device, from 1993 to 1998, the first concept for a true iPhone-type device came about in 2000, when Apple designer John Casey circulated some concept art via internal email for something he called the Telipod—a telephone and iPod combination. The Telipod never made it into production, but Apple co-founder and CEO Steve Jobs did believe that cell phones with touchscreen functionality and access to the Internet were the future of accessible information. Accordingly, Jobs set a team of engineers to work on the project.

Apple’s First Smartphone

Apple’s first smartphone, the ROKR E1, was released on Sept. 7, 2005. It was the first mobile phone to use iTunes, the music-sharing software Apple had debuted in 2001. However, the ROKR was an Apple and Motorola collaboration, and Apple was not happy with Motorola’s contributions. Within a year, Apple discontinued support for the ROKR. On Jan. 9, 2007, Steve Jobs announced the new iPhone at the Macworld Convention. It went on sale on June 29, 2007.

What Made the iPhone So Special

Jonathan Ive, who worked at Apple from 1992 to 2019 and became its chief design officer in 2015, was largely responsible for the look and feel of the iPhone. Born in Britain in February 1967, Ive was also the principal designer of the iMac, the titanium and aluminum PowerBook G4, the MacBook, the unibody MacBook Pro, the iPod, the iPhone, and the iPad.

The first smartphone with no dedicated keypad for dialing, the iPhone was entirely a touchscreen device that broke new technological ground with its multitouch controls. In addition to using the screen to select and open apps, users could scroll and zoom with finger gestures.

The iPhone also featured an accelerometer, a motion sensor that allowed the user to turn the phone sideways and have the display automatically rotate to match. While it was not the first device to have apps or software add-ons, it was the first smartphone to manage the apps market successfully.
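As a rough illustration of how an accelerometer can drive auto-rotation, here is a hypothetical sketch (not Apple’s implementation; the axis names and sign conventions are assumptions for the example): the software compares the gravity components reported along the screen’s two axes and picks whichever orientation gravity points “down” in.

```python
# Hypothetical sketch of accelerometer-based auto-rotation.
# Convention assumed here: when the phone is upright, gravity pulls
# along the negative y axis; tilted on its side, gravity shows up
# mostly on the x axis instead.
def orientation(ax: float, ay: float) -> str:
    """Pick a display orientation from gravity components (in g units).

    ax: acceleration along the screen's horizontal axis
    ay: acceleration along the screen's vertical axis
    """
    if abs(ay) >= abs(ax):
        return "portrait" if ay < 0 else "portrait-upside-down"
    return "landscape-left" if ax < 0 else "landscape-right"

print(orientation(0.0, -1.0))    # upright phone -> portrait
print(orientation(-0.98, -0.1))  # tilted onto its side -> landscape-left
```

A real device would also low-pass filter the readings and require the tilt to persist briefly, so the screen doesn’t flicker between orientations.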

The iPhone 4S, released in 2011, added a personal assistant called Siri: a voice-controlled, artificial intelligence-based assistant that could not only perform numerous tasks for the user but also learn and adapt to better serve that user. With the addition of Siri, the iPhone was no longer a mere phone or music player—it put an entire world of information at the user’s fingertips.

Waves of the Future

Apple has continued to improve and update the iPhone since its debut. The iPhone X (pronounced “ten”), released in November 2017, was the first iPhone to use an organic light-emitting diode (OLED) screen, wireless charging, and facial recognition to unlock the phone.

In 2018, Apple released three successors to the iPhone X: the iPhone XS, the iPhone XS Max (a larger version of the XS), and the budget-friendly iPhone XR, all with improved camera technology that enables what Apple terms “Smart HDR” (high dynamic range) photography. Going forward, Apple is expected to continue with OLED displays for its 2019 devices, and there are rumors that the company plans to retire its earlier LCD (liquid crystal display) screens altogether.


The History of Cellular Phones

In 1947, researchers studying crude mobile (car) phones realized that by using small cells (limited service areas) with frequency reuse, they could substantially increase the traffic capacity of mobile phones. However, the technology to do so did not exist at the time.
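The capacity math behind that 1947 insight can be sketched with purely hypothetical numbers: splitting a service area into cells and reusing each frequency in cells far enough apart multiplies the number of simultaneous calls the same spectrum can carry.

```python
# Toy capacity model for frequency reuse (illustrative numbers only).
def simultaneous_calls(total_channels: int, num_cells: int, reuse_factor: int) -> int:
    """Simultaneous calls a cellular layout can carry in this toy model.

    total_channels: channels licensed for the whole service area
    num_cells:      cells the area is divided into
    reuse_factor:   cluster size N -- each cluster of N adjacent cells
                    splits the channel pool, and the pattern repeats
    (Integer division: channels that don't divide evenly go unused here.)
    """
    channels_per_cell = total_channels // reuse_factor
    return channels_per_cell * num_cells

# One big "cell" covering the whole city: 23 channels -> 23 calls,
# like the FCC's 1947 limit described below.
print(simultaneous_calls(23, 1, 1))   # 23

# Same 23 channels, city split into 70 cells with a 7-cell reuse pattern:
print(simultaneous_calls(23, 70, 7))  # 210
```

The spectrum is unchanged in both cases; only the geometry differs, which is why cellular layouts were so attractive.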

Regulation

Then there’s the issue of regulation. A cell phone is a type of two-way radio, and anything to do with broadcasting or sending a radio or television message out over the airwaves falls under the authority of the Federal Communications Commission (FCC). In 1947, AT&T proposed that the FCC allocate a large number of radio-spectrum frequencies so that widespread mobile telephone service would become feasible, which would also have given AT&T an incentive to research the new technology.

The agency’s response? In 1947, the FCC decided to limit the number of frequencies available. The limits made only 23 simultaneous phone conversations possible in the same service area, and with them went the market incentive for research. In a way, we can partially blame the FCC for the gap between the initial concept of cellular service and its availability to the public.


It wasn’t until 1968 that the FCC reconsidered its position, stating that “if the technology to build a better mobile service works, we will increase the frequency allocation, freeing the airwaves for more mobile phones.” With that, AT&T and Bell Labs proposed a cellular system to the FCC: many small, low-powered broadcast towers, each covering a “cell” a few miles in radius and collectively covering a larger area. Each tower would use only a few of the total frequencies allocated to the system, and as phones traveled across the area, calls would be passed from tower to tower.
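The tower-to-tower handoff idea can be illustrated with a toy sketch (the positions and spacing are invented; real systems hand off based on measured signal strength, not simple distance):

```python
# Toy handoff model: a call follows a phone driving past a row of towers.
TOWER_POSITIONS = [0.0, 4.0, 8.0, 12.0]  # tower centers along a highway, in miles

def serving_tower(phone_position: float) -> int:
    """Index of the nearest tower; the call is handed off when it changes."""
    return min(range(len(TOWER_POSITIONS)),
               key=lambda i: abs(TOWER_POSITIONS[i] - phone_position))

# Drive down the highway one mile at a time and watch the handoffs
# (with this spacing they occur at miles 3, 7, and 11).
current = serving_tower(0.0)
for mile in range(13):
    tower = serving_tower(float(mile))
    if tower != current:
        print(f"mile {mile}: handoff from tower {current} to tower {tower}")
        current = tower
```

Because each tower serves only its own small cell, the frequencies a call vacates at one tower become available again for other callers there.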

Dr. Martin Cooper, a former general manager of the systems division at Motorola, is considered the inventor of the first modern portable handset. In fact, Cooper made the first call on a portable cell phone in April 1973 to his rival Joel Engel, then head of research at Bell Labs. The phone was a prototype called the DynaTAC and weighed 28 ounces. Bell Laboratories had introduced the idea of cellular communications in 1947 with its police-car radio technology, but it was Motorola that first incorporated the technology into a portable device designed for use outside of automobiles.

By 1977, AT&T and Bell Labs had constructed a prototype cellular system. A year later, public trials of the new system were held in Chicago with over 2,000 customers. In 1979, in a separate venture, the first commercial cellular telephone system began operation in Tokyo. In 1981, Motorola and American Radio Telephone started a second U.S. cellular radiotelephone system test in the Washington/Baltimore area. And by 1982, the slow-moving FCC finally authorized commercial cellular service for the United States.

So despite the incredible demand, it took many years for cellular phone service to become commercially available in the United States. Consumer demand soon outstripped the 1982 system standards, and by 1987 cellular telephone subscribers exceeded one million, with the airwaves becoming more and more crowded.

There are basically three ways of improving service: regulators can increase the frequency allocation, existing cells can be split, and the technology can be improved. The FCC did not want to hand out any more bandwidth, and building or splitting cells would have been expensive as well as adding bulk to the network. So to stimulate the growth of new technology, the FCC declared in 1987 that cellular licensees could employ alternative cellular technologies in the 800 MHz band. With that, the cellular industry began to research new transmission technologies.


A History of Apple Computers


Before it became one of the wealthiest companies in the world, Apple Inc. was a tiny start-up in Los Altos, California. Co-founders Steve Jobs and Steve Wozniak, both college dropouts, wanted to develop the world’s first user-friendly personal computer. Their work ended up revolutionizing the computer industry and changing the face of consumer technology. Along with tech giants like Microsoft and IBM, Apple helped make computers part of everyday life, ushering in the Digital Revolution and the Information Age.

The Early Years

Apple Inc. — originally known as Apple Computer — began in 1976. Founders Steve Jobs and Steve Wozniak worked out of Jobs’ garage at his home in Los Altos, California. On April 1, 1976, they debuted the Apple I, a desktop computer that, unlike other personal computers of that era, came as a single pre-assembled motherboard.

The Apple II was introduced about a year later. The upgraded machine included an integrated keyboard and case, along with expansion slots for attaching floppy disk drives and other components. The Apple III was released in 1980, one year before IBM released the IBM Personal Computer. Technical failures and other problems with the machine resulted in recalls and damage to Apple’s reputation.

The first home computer with a GUI, or graphical user interface — an interface that lets users interact with visual icons — was the Apple Lisa. The very first graphical interface was developed by the Xerox Corporation at its Palo Alto Research Center (PARC) in the 1970s. Steve Jobs visited PARC in 1979 (after arranging for Xerox to buy pre-IPO Apple stock) and was impressed and highly influenced by the Xerox Alto, the first computer to feature a GUI. That machine, though, was quite large. Jobs adapted the technology for the Apple Lisa, a computer small enough to fit on a desktop.

The Macintosh Computer

In 1984, Apple introduced its most successful product yet — the Macintosh, a personal computer that came with a built-in screen and mouse. The machine featured a GUI, an operating system known as System 1 (the earliest version of Mac OS), and a number of software programs, including the word processor MacWrite and the graphics editor MacPaint. The New York Times said that the Macintosh was the beginning of a “revolution in personal computing.”

In 1985, Jobs was forced out of the company over disagreements with Apple’s CEO, John Sculley. He went on to found NeXT Inc., a computer and software company that Apple later purchased in 1997.

Over the course of the 1980s, the Macintosh underwent many changes. In 1990, the company introduced three new models — the Macintosh Classic, Macintosh LC, and Macintosh IIsi — all of which were smaller and cheaper than the original computer. A year later Apple released the PowerBook, the earliest version of the company’s laptop computer.

The iMac and the iPod

In 1997, Jobs returned to Apple as the interim CEO, and a year later the company introduced a new personal computer, the iMac. The machine became iconic for its semi-transparent plastic case, which was eventually produced in a variety of colors. The iMac was a strong seller, and Apple quickly went to work developing a suite of digital tools for its users, including the music player iTunes, the video editor iMovie, and the photo editor iPhoto. These were made available as a software bundle known as iLife.


In 2001, Apple released its first version of the iPod, a portable music player that allowed users to store “1,000 songs in your pocket.” Later versions included models such as the iPod Shuffle, iPod Nano, and iPod Touch. By 2015, Apple had sold 390 million units.

The iPhone

In 2007, Apple extended its reach into the consumer electronics market with the release of the iPhone, a smartphone that sold over 6 million units. Later models of the iPhone have added a multitude of features, including GPS navigation, Touch ID, and facial recognition, along with the ability to shoot photos and video. In 2017, Apple sold 223 million iPhones, making the device the top-selling tech product of the year.


Who Invented the iPhone?

It all depends on what you mean by “invented”


The great man theory has crept back into popular culture in recent years, repurposed for the world of entrepreneurs, tech start-ups and digital conglomerates. Elon Musk revolutionized the electric car. Mark Zuckerberg pioneered the social network. Steve Jobs and his team at Apple invented the iPhone.

These heroic narratives are both factually incorrect and unhelpful. In educational terms, a whole generation is growing up on inspirational YouTube videos revering individualism and some troubling leadership traits. Yet the challenges the world faces—energy crises, food shortages, climate change, overpopulation—require collaboration and cooperation from all of us, both as global citizens and as nations. These challenges are too complex, interconnected and fast-moving to be solved by any one person, idea, organization or nation. We will need to harness the fundamental principle underpinning all research—to stand on the shoulders of giants, with each new breakthrough building on the work of others before it. The hidden story of the iPhone is a testament to this.

The relentless drive and ingenuity of the many teams at Apple cannot be doubted. But there were hundreds of research breakthroughs and innovations without which the iPhone would not even be possible. Each was the result of countless researchers, universities, funders, governments and private companies layering one innovation on top of another.

To demonstrate this, here’s a closer look at just three of the research breakthroughs that underpin the iPhone.

THE TOUCH SCREEN

The iPhone wouldn’t be the iPhone without its iconic touch-screen technology.

The first touch screen was actually invented way back in the 1960s by Eric Arthur Johnson, a radar engineer working at a government research center in the U.K. While the Righteous Brothers were losing that lovin’ feeling, Johnson was publishing his findings in an Electronics Letters article published by the Institution of Engineering and Technology. His 1965 article, “Touch display—a novel input/output device for computers,” continues to be cited by researchers to this day. The 1969 patent that followed has since been cited by a whole host of famous inventions—including Apple’s 1997 patent for “a portable computer handheld cellular telephone.”

Since Johnson’s first leap forward, billions of dollars have been awarded to research on touch-screen technology—from public bodies and private investors alike, with one often leading to the other. The University of Cambridge, for example, recently spun out a limited company to secure further investment for their own research on touch-screen technology, successfully closing a $5.5m investment round backed by venture capitalists from the U.K. and China.

One Apple patent on touch-screen technology cites over 200 scientific peer-reviewed articles, published by a range of academic societies, commercial publishers and university presses. These authors did not work alone. Most were part of a research group. Many were awarded a grant for their research. Each had their article independently evaluated by at least one external academic in the peer-review process that sits at the core of academic research. Consider one article on touch-screen technology recently published by Elsevier’s Information Sciences journal. Six authors and two blind peer reviewers are acknowledged. Conservatively extrapolating such figures across the two hundred articles cited by Apple tallies to over a thousand researchers, each making their important contribution to this area of touch-screen technology.

Johnson may have taken the first step, and Apple harnessed its potential, but we owe touch-screen technology to the collective efforts of numerous researchers all over the world.

THE LITHIUM BATTERY

Battery Low. Blink, blink. We all know iPhones soak up a lot of power, yet they’d be nowhere without the rechargeable lithium battery.

British scientist Stanley Whittingham created the very first example of the lithium battery while working in a lab for ExxonMobil in the ‘70s, carrying forward research he’d initially conducted with colleagues at Stanford University. Previous research had already indicated that lithium could be used to store energy, but it was Whittingham and his team that figured out how to do this at room temperature—without the risk of explosion (Samsung take note).

A professor at the University of Oxford, John Goodenough, then improved on Whittingham’s original work by using metal oxides to enhance performance. This, in turn, piqued the interest of Sony, which became the first company to commercialize lithium batteries in the 1990s, launching a lithium-powered cell phone in Japan in 1991. All of this provided the basis for mass use, with Apple duly obliging when it first launched the iPhone to over a million users in 2007.

Lithium’s story doesn’t stop there. As one of the building blocks of a world without fossil fuels, its production is zealously guarded. So who do you think bought Sony’s battery business in 2016? Why, one of Apple’s leading suppliers no less, Murata Manufacturing. Meanwhile, John Goodenough, now 95, continues his groundbreaking research. Only a few months ago he published a landmark study in the Journal of the American Chemical Society. Among its claims? That Goodenough had created a lithium battery for electric cars that can be used 23 times more than the current average.


THE INTERNET AND THE WORLD WIDE WEB

When Apple engineer Andy Grignon first added internet functionality to an iPod in 2004, Steve Jobs was far from enthusiastic: “This is bullshit. I don’t want this. I know it works, I got it, great, thanks, but this is a shitty experience.”

The painstaking work of multiple Apple teams took a “shitty experience” and made something revolutionary—all collective human experience and knowledge right there, in your back pocket, at the touch of your fingertips. But who do we have to thank for this?

Sir Tim Berners-Lee is widely credited with the invention of the World Wide Web. His work began in the 1980s while at the European Organization for Nuclear Research. Better known by its French acronym, CERN was established by 12 European governments in 1952 and continues to be funded by its member states. Berners-Lee’s ideas began as a proposed solution for a very specific problem at CERN: how best to facilitate the sharing and updating of the vast amounts of information and data used by CERN researchers. His proposal was based on the concept of hypertext, a term first coined by the theoretical pioneer Ted Nelson in a 1965 paper published by the Association for Computing Machinery. Often compared to an electronic version of the footnoting system used by researchers the world over, hypertext underpins the web, enabling you to jump from one source of information to another. Anywhere on the Internet. In whatever form it may be.

But even Berners-Lee cannot be given solo credit. If the World Wide Web is the map, the internet is the landscape we navigate: a networking infrastructure connecting millions of computers globally, enabling each to communicate with the other, transferring vast quantities of information.

To trace the origins of the internet, we have to return to 1965. While Nelson was coining “hypertext” and Johnson was inventing the touch screen, two researchers at MIT, Thomas Merrill and Lawrence Roberts, connected their computer to another 3,000 miles away in California using a simple low-speed dial-up telephone line. Shortly after that came Arpanet—not a dystopian AI system, but the Advanced Research Projects Agency Network. Arpanet was established and funded by DARPA, the U.S. Defense Advanced Research Projects Agency, and was initially conceived as a means of interconnecting the American military’s computers across their various regional hubs.

It was Arpanet that really gave birth to the internet, in a moment described below by Leonard Kleinrock. It’s October 1969, three months after man has walked on the moon, and Kleinrock and his colleagues have just connected multiple computers across the U.S.:

We typed the L and we asked on the phone, “Do you see the L?”

“Yes, we see the L.”

We typed the O, and we asked, “Do you see the O?”

“Yes, we see the O.”

Then we typed the G, and the system crashed…

The course of true innovation never did run smoothly. But these early breakthroughs of the space age were the basis for all that was to follow. While the modern iPhone is now 120 million times more powerful than the computers that took Apollo 11 to the moon, its real power lies in its ability to leverage the billions of websites and terabytes that make up the internet.

A brief analysis of these three research breakthroughs reveals a research web of over 400,000 publications since Apple first published their phone patent in 1997. Add the factor of supporting researchers, funders, universities and companies behind them, and the contributing network is simply awe-inspiring. And we’ve barely scratched the surface. There are countless other research breakthroughs without which the iPhone would not be possible. Some well-known, others less so. Both GPS and Siri had their origins with the U.S. military, while the complex algorithms that enable digitization were initially conceived to detect nuclear testing. All had research at their core.

The iPhone is an era-defining technology. Era-defining technologies do not come from the rare brilliance of one person or organization, but layer upon layer of innovation and decade upon decade of research, with thousands of individuals and organizations standing on each other’s shoulders and peering that bit further into the future. In our age of seemingly insurmountable global challenges, we must not only remember this but be inspired by it.

We must encourage openness and transparency at the heart of research, ensuring it is disseminated as widely, quickly and clearly as possible. We must remember that every delay and distortion matters. Research integrity and reproducibility, transparent peer review, open access, diversity—these are more than just buzzwords. They are exciting steps toward reforming the infrastructure of a global research ecosystem that has always been our best hope for the future.

The views expressed are those of the author(s) and are not necessarily those of Scientific American.

ABOUT THE AUTHOR(S)

Matthew Hayes is the author of «Robert Kennedy and the Cuban Missile Crisis,» published in the journal History. He is director of publisher and funder growth at Publons, the world’s largest peer review platform and a part of Clarivate Analytics. He studied history at Oxford University, has a master’s in international relations from SOAS, University of London, and is currently researching a PhD on global citizenship education at the Institute of Education, University College London.

