THERE are happy accidents all the time, but few unexpected discoveries have the potential to influence history like Richard Kaner's. The 55-year-old UCLA chemistry professor had been trying to develop more efficient ways to produce a new carbon-based material called graphene, one of the strongest substances known to man, but he and his assistant, Maher El-Kady, found that when they exposed the material to light in the lab, it transformed into a super-capacitor—in other words, a highly efficient, biodegradable power source capable of charging 30 to 100 times faster than current lithium-ion batteries, juicing up smartphones and, potentially, electric cars in seconds. Although batteries are a few years from the market, the race to harness the full paradigm-shifting potential of Kaner’s discovery is already on. “As people become more familiar with the technology, they’ll find new applications for it,” he says. [Details.com]
The product will unfortunately not be commercialized for some time. The article is very vague and does not address any of the larger issues with using graphene in supercapacitors; reading the published paper would be best. Also, a supercapacitor is by no means a super battery. Batteries are still needed for their energy density, while supercapacitors are better suited to applications that require higher power densities. There are also competing technologies, such as pseudocapacitors based on nickel oxides.
Don’t get me wrong: the research is great. However, I would not expect it to solve our energy storage problems, as this article might lead us to believe.
More about graphene
How Capacitors Work: Capacitors and batteries both store electrical energy, but unlike batteries, capacitors can’t produce new electrons; they can only store the charge supplied to them. Capacitors are widely used as components of electrical circuits in many common electrical devices.
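For a sense of scale (illustrative numbers of my own, not figures from the article or the paper): the energy stored in a capacitor of capacitance C charged to voltage V is

E = ½ C V²

so even a very large 3000 F supercapacitor at its rated 2.7 V holds about 0.5 × 3000 × 2.7² ≈ 11 kJ, or roughly 3 Wh, while a single ordinary 3.7 V, 2.5 Ah lithium-ion 18650 cell holds about 9 Wh in a far smaller package. The capacitor can absorb or release that energy almost instantly, which is where the “charges in seconds” headline comes from; it just stores much less of it, which is the energy-density gap noted above.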
PREVIOUS: Carbon Aerogel Discovery
Games People Play
GRAND Theft Auto V, which puts the player in control of three ne’er-do-wells as they fight, steal, shoot, fly and drive their way through “one last job,” has been a big investment for development company Rockstar, owned by Take-Two Interactive. Rockstar reportedly sank as much as $265 million into the project. That is roughly $100 million more than the budget of the blockbuster crime/car-carnage movie “Fast & Furious 6” and, if true, would make “GTA V” the most expensive video game ever made. Other analysts have estimated that the game cost only about as much as that film. Either way, it’s clear that the game is setting a new bar for blockbuster entertainment.
“GTA V” also looks likely to become the most profitable game ever. It is slated to sell 24 million copies and rake in over a billion dollars, a figure that rivals successful films like “Skyfall” and “Iron Man 3.” In anticipation of its launch, shares of Take-Two Interactive have surged 60 percent since the start of the year.
There’s no question that the face of gaming has changed. Back in the heady days of acid-wash jeans and gaudy tracksuits, most games were created by small production houses, with programmers filling multiple roles. Even as late as 2001, the cult-classic video game “Max Payne” had its titular character modeled on its writer, Sam Lake. Fast forward to 2013, and the upcoming Quantic Dream title “Beyond: Two Souls” bills Hollywood stars Ellen Page and Willem Dafoe, with a score composed by the renowned Hans Zimmer.
Far from the pubescent, acne-studded nerd world of ’90s gaming as portrayed in multiple cringe-worthy movies, most surveys now peg the average gamer as a thirtysomething (of either sex) who games to relax after juggling work, kids, and other responsibilities. The skateboarding, Mountain Dew-drinking gamers of the ’90s have grown up, and gaming has grown up with them.
It’s hard to imagine anyone lining up at midnight for a cartridge of “Missile Command” or “Breakout.” Like it or not, games have evolved into a nuanced, artistically valid and profitable entertainment sector. Check out some of the steps in this evolution in our slideshow of 21 Video Games That Rocked The Industry. […]
- World of Warcraft (over $10B)
- Call of Duty: Black Ops
- Mario Kart for Wii
- Grand Theft Auto 4
- Wii Play
- New Super Mario Bros
- Gran Turismo 3
- Call of Duty: Modern Warfare 2
- The Sims
- Call of Duty: Modern Warfare ($700M)
- Grand Theft Auto 4 ($100M)
- Gran Turismo 5
- Too Human
- Metal Gear Solid 4
- Halo 3
- APB: All Points Bulletin
- LA Noire
- Final Fantasy XII
- Killzone 2 ($45M)
- Launch of Call of Duty: Ghosts + Why is COD So Popular? [vid]
- Batman: Arkham Origins Plays it Safe, But Still Satisfies
- Amnesia: A Machine for Pigs is Light on Challenge, But Big on Tension
Contre Jour is a physics-based puzzle video game for standard web browsers (HTML5), Windows Phone, Android, Apple iOS and Symbian. It was developed in 2011 by Ukrainian developer Mokus and published by Electronic Arts through its label Chillingo. The game’s art was created by artist Mihai Tymoshenko, and its soundtrack was composed by David Ari Leon. The game centers on a little blob named Petit (a reference to Le Petit Prince), who has no means of locomotion of his own; the player maneuvers him through the game’s various areas by manipulating his environment with the touch screen.
Bell Labs’ Holmdel, New Jersey campus (1962 - 2007) was designed by famed Finnish architect Eero Saarinen [More info here]
Bell Labs’ Murray Hill, NJ location, c.1959 [via]
Bell Labs, Murray Hill [c.1979]
Bell Laboratories (also known as Bell Labs and formerly known as AT&T Bell Laboratories and Bell Telephone Laboratories) is the research and development subsidiary of the French-owned Alcatel-Lucent, headquartered in Murray Hill, New Jersey, United States. It was founded in 1925 and previously was a division of the American Telephone & Telegraph Company (AT&T Corporation), half-owned through its Western Electric manufacturing subsidiary.
Bell Laboratories operates its headquarters at Murray Hill, New Jersey, and has research and development facilities throughout the world. Researchers working at Bell Labs are credited with the development of radio astronomy, the transistor, the laser, the charge-coupled device (CCD), information theory, the UNIX operating system, and the C, S and C++ programming languages. Seven Nobel Prizes have been awarded for work completed at Bell Laboratories.
Bell Laboratories, which thrived from the 1920s to the 1980s, was the most innovative and productive institution of the twentieth century. Long before America’s brightest scientific minds began migrating west to Silicon Valley, they flocked to this sylvan campus in the New Jersey suburbs built and funded by AT&T. At its peak, Bell Labs employed nearly fifteen thousand people, twelve hundred of whom had PhDs. Thirteen would go on to win Nobel prizes. It was a citadel of science and scholarship as well as a hotbed of creative thinking. It was, in effect, a factory of ideas whose workings have remained largely hidden until now.
"New York Times Magazine" writer Jon Gertner unveils the unique magic of Bell Labs through the eyes and actions of its scientists. These ingenious, often eccentric men would become revolutionaries, and sometimes legends, whether for inventing radio astronomy in their spare time (and on the company’s dime), riding unicycles through the corridors, or pioneering the principles that propel today’s technology. In these pages, we learn how radar came to be, and lasers, transistors, satellites, mobile phones, and much more.
Even more important, Gertner reveals the forces that set off this explosion of creativity. Bell Labs combined the best aspects of the academic and corporate worlds, hiring the brightest and usually the youngest minds, creating a culture and even an architecture that forced employees in different fields to work together, in virtually complete intellectual freedom, with little pressure to create moneymaking innovations. In Gertner’s portrait, we come to understand why both researchers and business leaders look to Bell Labs as a model and long to incorporate its magic into their own work.
Written with a novelist’s gift for pacing and an ability to convey the thrill of innovation, “The Idea Factory” yields a revelatory take on the business of invention. What are the principles of innovation? How do new technology and new ideas begin? Are some environments more favorable than others? How should they be structured, and how should they be governed? Can strokes of genius be accelerated, replicated, standardized? The history of Bell Labs provides crucial answers that can and should be applied today by anyone who wants to understand where good ideas come from.
Jon Gertner’s book about this great American institution excels in three ways. Firstly, it describes in detail the genesis of what was then an unlikely research institution. Until then, most communication-related work was considered to be squarely within the domain of engineering. Bell Labs arose from a need to improve the communications technology pioneered by its parent organization, AT&T. But the real stroke of genius was to realize the value that basic scientists - mainly physicists and chemists - could bring to this endeavor along with engineers. This was largely the vision of two men - Frank Jewett and Mervin Kelly. Jewett, who was the first president of Bell Labs, had the foresight to recruit promising young physicists who were proteges of his friend Robert Millikan, a Nobel Prize-winning physicist and president of Caltech. Kelly in turn was Millikan’s student and was probably the most important person in the history of the laboratory. It was Kelly who hired the first brilliant breed of physicists and engineers, including William Shockley, Walter Brattain, Jim Fisk and Charles Townes, and who would set the agenda for future famous discoveries. During World War II, Bell gained a reputation for taking on challenging military projects like radar; by the end of the war it had handled almost a thousand of these. The war made the benefits of supporting basic science clear. It was Kelly again who realized that the future of innovation lay in electronics. To this end he moved Bell Labs from its initial location in New York City to an expansive wooded field near Murray Hill, New Jersey, and recruited even more brilliant physicists, chemists and engineers. This added further fuel to the fire of innovation started in the 1930s, and from then on the laboratory never looked back.
Secondly, Gertner gives a terrific account of the people who populated the buildings in Murray Hill and of the discoveries which immortalized the laboratory. Kelly instituted a policy of hiring only the best minds, and it did not matter whether these were drawn from industry, academia or government. In some cases he would go to great lengths to snare a particularly valuable scientist, offering lucrative financial incentives along with unprecedented freedom to explore ideas. This led to a string of extraordinary discoveries which Gertner describes in rich and accessible detail. One feature of the book that stands out is Gertner’s effort to describe the actual science instead of skimming over it; for instance, he pays due attention to the revolution in materials chemistry that was necessary for designing semiconductor devices. The sheer number of important things Bell scientists discovered or invented beggars belief; even a limited but diverse sampling includes the first transatlantic telephone cable, transistors, UNIX, C++, photovoltaic cells, error-corrected communication, charge-coupled devices and the statistical process control that now forms the basis of the six-sigma movement. The scientists were a fascinating, diverse lot, and Gertner brings a novelist’s eye to describing them. There was Bill Shockley, the undoubtedly brilliant, troubled, irascible physicist whose sin of competing against his subordinates led to his alienation at the lab. Gertner provides a fast-paced account of those heady days in 1947 when Bardeen, Brattain and Shockley invented the transistor, the truly world-changing invention that is Bell Labs’s greatest claim to fame. Then there was Claude Shannon, the quiet, eccentric genius who rode his unicycle around the halls and invented information theory, which essentially underlies the entire modern digital world. Described also are Arno Penzias and Robert Wilson, whose work with an antenna built for early communications satellite experiments - satellites that Bell also built - led to momentous evidence supporting the Big Bang. The influence of the laboratory was so formative that even the people who left Bell Labs later went on to greatness; several of these, such as future energy secretary Steven Chu, joined elite academic institutions and won Nobel Prizes (Bardeen won two). It’s quite clear that a cast of characters like the one that passed through this institution will probably never again be concentrated in one place.
But perhaps the most valuable part of the book deals not with the great scientific personalities or their discoveries but with the reasons that made Bell tick. When Kelly moved the lab to Murray Hill, he designed its physical space in ways that would have deep repercussions for productive thought and invention. Most crucially, he interspersed the basic and applied scientists without any separation. That way even the purest of mathematicians was forced to interact with and learn from the most hands-on engineer. This led to an exceptional cross-fertilization of ideas, an early precursor of what we now call multidisciplinary research. Labs and offices were divided by soundproof steel partitions that could be moved to expand and rearrange working spaces. The labs were all lined along a very long, seven-hundred-foot corridor where everybody worked with their doors open. This physical layout ensured that when a scientist or engineer walked to the cafeteria, he or she would “pick up ideas like a magnet picks up iron filings”. Other rules only fed the idea factory. For instance, you were not supposed to turn away a subordinate who came to ask you for advice. This led to the greenest of recruits learning at the feet of masters like Bardeen or Shannon. Most importantly, you were free to pursue any idea or research project you wanted, free to ask anyone for advice, free to be led wherever the evidence pointed. Of course, this extraordinary freedom was made possible by the immense profits generated by the monopolistic AT&T, but the heart of the matter is that Bell’s founders recognized the importance of focusing on long-term goals rather than short-term profits. They did this by gathering bright minds under one roof and giving them the freedom and time to pursue their ideas. And as history makes clear, this policy led not only to fundamental discoveries but also to practical inventions that greatly benefited humanity. Perhaps some of today’s profitable companies, like Google, can lift a page from AT&T and channel more of their profits into basic, broadly defined, curiosity-driven research.
Gertner’s highly readable book leaves us with a key message. As America struggles to stay competitive in science and technology, Bell Labs still provides the best example of what productive industrial research can accomplish. There are many lessons that modern organizations can learn from it. One interesting lesson, arising from the cohabitation of research and manufacturing under the same roof, is that beyond a point it may not be healthy to isolate one from the other, a caveat that bears directly on current offshoring policies. It is important to have the people involved in all aspects of R&D talking to each other. But the greatest message of all from the story of this remarkable institution is simple, and it should not be lost in this era of short-term profits, layoffs and declining investment in fundamental research: the best way to generate ideas is still to hire the best minds, put them all in one place and give them the freedom, time and money to explore, think and innovate. You would be surprised how much long-term benefit comes from that policy. As they say, mighty trees from little acorns grow, and it’s imperative to nurture those little seeds.
Bell Telephone Laboratories, as my colleagues and I experienced it during the 1960s and 1970s, was a beehive of scientific and technological scurrying. Practitioners within, tethered on long leashes if at all, were earnestly seeking enigmatic solutions to arcane puzzles. What happened there would have baffled millions of telephone subscribers who, knowingly or not, agreeably or not, supported the quiet circus.
For people who believe in science, and who still believe in technology, it was the epitome of free exploration into how the world did, or could, work. For those concerned with tangible results, the verdict, albeit delayed, is indisputable: fiber optics, the transistor, Echo and Telstar, radio astronomy including confirmation of the Big Bang. Advances in metallurgy, computational methods, and all manner of information storage, transmission and processing. Bell Labs truly was a national resource, and for anyone who was there or who cared, its decline is one of the great tragedies of the past half century. […]
My main interest was computers, particularly their use in picture-making. The Labs had a new microfilm printer that exposed letters and vectors on 35 mm film. Some of my friends were soon making simple movies (with terrible vertical jitter because the camera lacked filmgate registration pins).
My own shtick became a sort of greyscale picture made by filling the screen with thousands of different letters chosen for their brightness. I soon wrote a memo to department head Tom Crowley, suggesting the possibility of a “computer language” for making animated movies; his two-part response launched my career in raster graphics: “It sounds rather ambitious, but why don’t you see what you can do?”
Within a year, I had a set of subroutines someone dubbed BEFLIX, acronym for “Bell Flicks,” arguably the first computer language specifically for bitmap movie making. (I have also been called the inventor of the pixel, which is a bit of a reach, though I might claim independent discovery.) […]
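The letters-chosen-for-brightness trick is easy to reconstruct today. Below is a minimal sketch in modern Python (my own toy code, not BEFLIX or the original microfilm workflow; it assumes the Pillow imaging library and a hypothetical input file): rank a handful of characters from dark to light, then print, for each pixel of a downscaled greyscale image, the character whose ink density best matches that pixel.

# Toy greyscale-by-characters picture, in the spirit of the technique
# described above. Modern illustrative code, not BEFLIX.
from PIL import Image  # assumes Pillow is installed

# Characters ordered roughly from dense (dark) to sparse (light).
# A real ramp would be measured from the output device's actual glyphs.
RAMP = "@%#*+=-:. "

def char_picture(path, width=80):
    img = Image.open(path).convert("L")  # "L" = 8-bit greyscale
    # Halve the height: printed characters are taller than they are wide.
    height = max(1, img.height * width // (img.width * 2))
    img = img.resize((width, height))
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            level = img.getpixel((x, y))  # 0 (black) .. 255 (white)
            row += RAMP[level * (len(RAMP) - 1) // 255]
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(char_picture("portrait.png"))  # hypothetical input image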
[Conventional computer transistor by Intel]
Transistor (def.) - a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become the key ingredient of all digital circuits, including computers. Today’s microprocessors contain tens of millions of microscopic transistors.
Prior to the invention of transistors, digital circuits were composed of vacuum tubes, which had many disadvantages. They were much larger, required more energy, dissipated more heat, and were more prone to failures. It’s safe to say that without the invention of transistors, computing as we know it today would not be possible. [Webopedia]
The inventors were jointly awarded the 1956 Nobel Prize in Physics for their achievement. [Wikipedia]
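The “opens or closes a circuit” half of that definition is the entire basis of digital logic. Here is a toy model in Python (my own sketch, not how any real chip is designed): treat each transistor as a voltage-controlled switch; two of them in series pulling an output low make a NAND gate, and because NAND is functionally complete, every other gate, and hence every digital circuit, can be assembled from it.

# Toy model: a transistor as a voltage-controlled switch.
def transistor(gate):
    """Idealized switch: conducts (True) when its gate input is high."""
    return gate

def nand(a, b):
    # The output is pulled low only when both series transistors conduct.
    return not (transistor(a) and transistor(b))

# Every other gate built from NAND alone:
def inv(a):
    return nand(a, a)

def and_(a, b):
    return inv(nand(a, b))

def or_(a, b):
    return nand(inv(a), inv(b))

# Truth table check:
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "NAND:", nand(a, b), "AND:", and_(a, b), "OR:", or_(a, b))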
I have done a lot of research on quantum computing, and it looks like it will never become a personal computer. It will only be used for research purposes at big organizations like NASA, because the tiniest vibration disrupts the alignment of the atoms, defeating the purpose of the quantum computer. They are also far too big and have to operate at almost 0 kelvin. I’m sounding like Gordon Moore when he said we would never have personal computers, but hey, anything can happen.
How Small is 22 nm? [vid]
- Intel Demonstrated 48 Core Silicon Chip
- From Sand to Circuits: How Silicon Chips Are Made
- Incredible Pics of Early Science Labs