
Tuesday, December 10, 2013

Life of technology

Image from: Wikipedia Commons
It seems taken for granted that computers run on electricity and work quietly (ignoring the fan). There was a time, however, when “computers” were like car engines, full of gears and camshafts.

The most famous mechanical computer was perhaps the “difference engine” designed by Charles Babbage in the 1800s; he is now considered the “father of the computer”. The machine became legendary and was commissioned by the British government to produce astronomical and mathematical tables. The best mechanical computers were perhaps the ones used to calculate bomb trajectories in World War II, after which computers became electronic.

It is hard but interesting to imagine what the world would be like if we were still using mechanical computers. Personal computing might still be possible, but each time you wanted to calculate something, you might have to hook the machine up to your car engine as a source of mechanical power. Silicon Valley would not be in California but near some coal mines; actually, it would not be called Silicon Valley at all, but perhaps Steel Valley. The Internet could also be conceived, though in the form of railways.

Although an information era powered by machinery does not sound realistic, it is still a pity to see the mechanical computer pass away, however hard Charles Babbage worked to bring it to life. And it is not only mechanical computers: vacuum tubes, for instance, which were the foundation of the first generation of electronic computers, can hardly be seen nowadays either. Even an electrical engineer may not know what a vacuum tube is!

I remember a lecture on magnetic devices about five years ago. The professor spoke wistfully about what he had witnessed in the recent history of magnetic technology. Computers once relied on small magnets (though huge from the viewpoint of nanotechnology) and magnetic tape for both information storage and processing. Yet now, magnets are only used in hard disks, while all other components are electronic, including the internal memory. In the 1980s and 1990s, a number of products similar to the CD and DVD were developed in Japan. Instead of relying on optical properties, these products used a layer of magnetic material to store information and were read magnetically. However, these magnetic disks never reached a wider market; they were only sold in Japan, and their existence was hardly known to people outside the country. Now they seem to have disappeared entirely, losing the whole market to CD, DVD, and Blu-ray. Magnetic technology pioneered many applications, but then lost the competition to later players.

This is like the wild. Animals struggle to survive, and if the speed of evolution fails to keep up with the changing environment, the whole species may go extinct. Closely tied to the technologies are the companies. We no longer have Sun Microsystems, the one-time software giant. And can anyone recall Palm? It was one of the first and best-known producers of personal digital assistants (PDAs) in the world. Recently, BlackBerry has found itself in the same situation Palm was in around 2009.

Image from: dualdflipflop
Perhaps the most ironic tragedy is that of Kodak. Kodak invented digital photography in 1975, but was killed by this very technology thirty years later. The company had once secured almost 90% of the photographic film market in the US. People at Kodak were reluctant to develop digital technology for fear that it would hurt the photographic film business. The shift happened anyway, but Kodak lost its leading position: the market gradually moved away from highly profitable photographic film, ending Kodak’s life. The technology of photographic film may soon be history as well.

If a technology can be thought of as a life, it is a life worth our respect. Although some technologies have faded away, there was once an era that belonged to them. Together they form a line of technological history, with monuments of human talent and effort, like the Pyramids and the Great Wall.

The singularity

Image from: Wikipedia Commons
I was thrilled to learn how well developed this idea already is.

The singularity, or more specifically the technological singularity, refers to a future point when artificial intelligence advances so far beyond our own that we cannot even guess what the world will be like. It is often feared as a disaster and the end of the human era.

Over the past century and the beginning of this one, the technology world has been developing exponentially. This is a big deal. Suppose you have 2 dollars this year and 4 dollars next year: if this growth is linear, 30 years later you will have about 60 dollars, while with exponential growth you will have more than a billion.
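A quick back-of-the-envelope check of those numbers, sketched in Python under the assumptions that the linear case adds 2 dollars per year and the exponential case doubles every year:

```python
# Linear vs. exponential growth, assuming $2 now and either
# +$2 per year (linear) or doubling every year (exponential).
start = 2
years = 30

linear = start + 2 * years          # 62 dollars after 30 years
exponential = start * 2 ** years    # 2,147,483,648 dollars -- over two billion

print(f"linear:      ${linear}")
print(f"exponential: ${exponential:,}")
```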

The most famous example may be Moore’s law, which describes the number of transistors on a computer chip doubling roughly every 24 months. Along similar trends, global telecommunication capacity doubles about every 34 months, the information throughput of computers doubles about every 14 months, and information storage capacity doubles about every 40 months. This fuels our economy, and at the same time makes a super-powered, man-made god or devil possible.
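To get a feel for what those doubling periods mean, here is a small sketch (the periods are the ones quoted above; the code itself is only an illustration) that converts each into how many times the capacity multiplies over a decade:

```python
# Convert a doubling period (in months) into total growth over ten years.
doubling_months = {
    "chip transistor count (Moore's law)": 24,
    "telecommunication capacity": 34,
    "computing throughput": 14,
    "storage capacity": 40,
}

for name, months in doubling_months.items():
    growth_over_decade = 2 ** (120 / months)   # 120 months = 10 years
    print(f"{name}: ~{growth_over_decade:,.0f}x in 10 years")
```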

According to the singularity hypothesis, this exponential growth could lead to a technological explosion, with information technology, nanotechnology, and genomics at the frontier. I. J. Good described an “intelligence explosion” as early as 1965. He expected the singularity to be the point at which machines can improve themselves in ways unforeseen by humans. Beyond that point, the self-improvement happens recursively and soon surpasses the human brain. This could happen suddenly.

It seems the world is heading toward this singularity fast. Europe has a huge project bringing together major universities and research institutes across the continent to study the human brain and apply that knowledge to intelligent systems; they want to build highly developed neuromorphic chips and robots. The first generation of neuromorphic chips is actually already on the horizon from the SyNAPSE project in the US, and such chips may hit the market soon. On October 8th this year, Matt Grob, CTO of Qualcomm, announced the company’s neuromorphic “Zeroth” processors at an MIT Technology Review event.

At the Singularity Summit 2012, there was a survey on when the singularity would happen. The median answer was 2040, less than 30 years from now. That seems quite optimistic about our ability to develop the technology, but pessimistic about our ability to control it.

Technology is actually progressing more slowly than it appears. The concept of artificial intelligence has been around for decades, but it has only recently reached daily life, and even now Siri is often useless and the GPS sometimes sends you the wrong way down a one-way street. Even at the cutting edge of nanotechnology, quantum physics has been known for almost a century, yet its implementation in the latest research devices is still very shallow; no one has been able to touch the heart of quantum physics in applications yet. Turning what is known into what is applicable is still a hard job.

Perhaps some social problems are more urgent than the singularity. Before the singularity can possibly happen, unemployment will inevitably rise as jobs once done by people are taken over by machines. This may slow technological development and cause other problems as well. That seems to be a real problem to worry about, rather than whether humans will be enslaved by robots.

Nevertheless, the notion behind the Singularity Summit and Singularity University is quite encouraging: identify and promote the ideas that will make a difference to humanity's future.

Monday, December 9, 2013

The man-made brain

Image from: Rick Bolin
It seems to me that the most exciting thing is knowing what will be possible in the near future. I have been interested in artificial intelligence for years, but I was still surprised to learn how close we now are to an artificial brain.

IBM and HRL Laboratories aim to build a neuromorphic microprocessor as capable as a cat’s brain by 2016, as an outcome of DARPA’s SyNAPSE project, which started in 2008.

This may seem trivial given that IBM machines have already beaten humans at chess (Deep Blue) and at the quiz show Jeopardy! (Watson, a supercomputer equipped with intelligent software). Watson is now being promoted for applications in medical care as well. Nevertheless, the breakthrough of SyNAPSE is to put the intelligence on a chip instead of a supercomputer. Artificial neural network technology has long been widely used, but high performance comes at a huge cost: IBM Watson is as large as several rooms, and the energy in your three meals a day could not keep it running for even an hour. Realizing neural networks at the chip level makes a big difference here.

The basic idea is to use a fundamentally different architecture when building the microprocessor. Traditionally, all computers have followed the von Neumann architecture, in which the system has separate units for memory, processing, control, and input-output, and information is processed sequentially. In a neuromorphic chip, however, memory, processing, and control are intermixed, and a high level of parallelism is achieved. I had thought this kind of circuit would inevitably require new materials and device characteristics, but I had overlooked the potential of current transistor-based integrated circuit technology.
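To make the contrast concrete, here is a toy sketch of the neuromorphic style of computation: a small layer of leaky integrate-and-fire neurons in which every neuron keeps its own state and synaptic weights (memory sitting next to the computation) and all neurons are updated in parallel. This is emphatically not the SyNAPSE chip design, and all the names and parameters here are made up for illustration:

```python
import numpy as np

# Toy leaky integrate-and-fire layer: an illustrative sketch only, not a
# real neuromorphic chip. Each "neuron" holds its own membrane potential
# (state) and incoming synaptic weights, and all neurons update in parallel.
rng = np.random.default_rng(0)

n_inputs, n_neurons = 16, 8
weights = rng.normal(0, 0.5, size=(n_inputs, n_neurons))  # synapses (local memory)
potential = np.zeros(n_neurons)                            # membrane state (local memory)
leak, threshold = 0.9, 1.0

for step in range(20):
    spikes_in = (rng.random(n_inputs) < 0.2).astype(float)  # random incoming spike events
    potential = leak * potential + spikes_in @ weights       # integrate inputs, with leakage
    fired = potential >= threshold                           # every neuron checked in parallel
    potential[fired] = 0.0                                   # reset the neurons that fired
    if fired.any():
        print(f"step {step:2d}: neurons {np.flatnonzero(fired).tolist()} fired")
```

The point of the sketch is only the shape of the computation: event-driven, parallel, and stateful, rather than a single processor stepping sequentially through instructions and fetching data from a separate memory.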

IBM is building such processors by implementing neuromorphic algorithms directly in the hardware design. Many of these algorithms have already been used in software; this “hardware coding” makes the system much more efficient than merely running intelligent software on conventional computers.

In an introductory video, Dr. John Arthur, one of the researchers at IBM, remarked that “current computers can do a fantastic job at adding numbers, but they do really poorly at recognizing faces, something the human brain is very good at.” The goal is to build systems that can do recognition tasks automatically.

In the same video, Dr. Horst Simon, the deputy director of Lawrence Berkeley National Laboratory, made a very interesting analogy with the invention of the airplane to describe this objective. Unlike birds, planes have rigid wings instead of flapping wings. Dr. Simon said:

“Neuromorphic computing is exactly at the stage where we are looking at ‘flapping wings’ and ‘rigid wings’. We don’t want to build a bird; we want to build a device that allows humans to fly. So, we don’t want to build a human brain here; we want to build devices that can help solve the tasks that current computers cannot solve with ease.”

“We don’t want to build a bird; we want to build a device that allows humans to fly.” I found this comment quite inspiring: take whatever reference helps finish the task. I always wonder what Dr. von Neumann’s reference was when he outlined the architecture that gave birth to our current computers. But now, our brain is the natural and powerful reference.

Monday, December 2, 2013

Money that does not exist

Image from: Antana
The exchange rate of Bitcoin against the US dollar passed $1,000 per bitcoin this month. You now need to spend more than $1,000 to buy one bitcoin, a virtual currency that was worth only a few dollars a year ago.

Bitcoin, which debuted in January 2009, is the first currency based on cryptography instead of a central authority. Unlike the US dollar, there is no agent playing the role the US government plays; instead, Bitcoin keeps a public ledger known as the “blockchain”, which is maintained by “miners” distributed across the Internet. Miners solve complicated mathematical puzzles to verify transactions and add them to the blockchain, and just like miners of gold, they receive newly issued bitcoins as a reward. A miner can simply be a personal computer, though machines built specifically for mining have recently appeared because of the fast-rising price.
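The “puzzle” is, at its core, a proof-of-work search: find a number (a nonce) that, hashed together with the block’s contents, produces a hash below a target. The sketch below is a deliberately simplified stand-in; real Bitcoin mining double-SHA-256 hashes an 80-byte block header against a far harder target, and the transaction string here is made up for illustration:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Find a nonce so that SHA-256(block_data + nonce) starts with
    `difficulty` zero hex digits -- a toy stand-in for Bitcoin's real
    double-SHA-256 proof of work over the block header."""
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1

# Example: "verify" a made-up batch of transactions with a small difficulty.
nonce, digest = mine("alice->bob:0.5btc", difficulty=4)
print(nonce, digest)
```

Raising the difficulty by one hex digit makes the search roughly sixteen times harder, which is why specialized mining hardware has become worthwhile as the price climbs.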

Bitcoin has been called the “gold of tomorrow” by the chief financial officer of the University of Nicosia, the first university to accept Bitcoin as payment for tuition. Indeed, Bitcoin is designed to behave like gold. The algorithm that issues new coins has an upper limit on the total number of coins: as time goes on and more bitcoins enter circulation, the reward for mining shrinks while the puzzles become ever harder to solve, so the miners’ pay keeps falling. Hardly any new coins can be generated once the number approaches 21 million, the total supply allowed by the Bitcoin protocol. With the price rising so wildly, Bitcoin mining now looks like a replay of the gold rush of the 1800s.
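The 21 million figure follows directly from the protocol’s reward schedule: the block reward started at 50 BTC and is cut in half every 210,000 blocks. A quick sanity check of that sum (the schedule is from the Bitcoin protocol; the code is just an illustration):

```python
# Total bitcoins that can ever be mined, from the halving schedule:
# 50 BTC per block at first, halved every 210,000 blocks, counted in
# satoshis (1 BTC = 100,000,000 satoshis) as the protocol does.
reward_satoshi = 50 * 100_000_000
blocks_per_halving = 210_000
total = 0

while reward_satoshi > 0:
    total += blocks_per_halving * reward_satoshi
    reward_satoshi //= 2          # reward halves (integer division, like the protocol)

print(total / 100_000_000)        # ~20,999,999.98 BTC -- just under 21 million
```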

Gold was the ultimate money until about 100 years ago. In economic theory, what made gold the favored money across cultures is that, first, it exists in limited supply and is hard to obtain, and second, it stores value. Bitcoin obviously possesses the first property, but whether it stores any value is a serious question. It will not be valuable unless it is widely accepted as a medium of exchange.

Until recently, the largest market accepting bitcoins was the black market. A study from Carnegie Mellon University estimated that about 4.5–9% of all Bitcoin transactions were used to buy illegal drugs on Silk Road, the “eBay for drugs” that was shut down by the FBI in October this year. Other illegal uses include gambling and money laundering. Real stores and businesses are now picking up the trend of accepting bitcoins, most notably WordPress and Baidu, the search giant in China. A Bitcoin ATM also opened recently at a coffee shop in Vancouver. Following this link, you will find more things you can buy with bitcoins.

Nevertheless, Bitcoin’s popularity as a medium of exchange is growing much more slowly than its popularity as an investment; the fast-rising price is driven largely by speculation rather than by transaction demand. Another major risk to the price is its uncertain legal status. Recently, both the People’s Bank of China and the Federal Reserve in the US stated that they do not accept bitcoins as currency, though neither deems bitcoins illegal. “People are free to buy and sell bitcoins as normal electronic goods,” said Yi Gang, deputy governor of the People’s Bank of China; “it is very inspiring, but we have no plan to recognize its legal status as a currency.” Not all countries are hesitating, though: some, like Germany, France, and the UK, have recognized Bitcoin, while others, like Thailand, have banned it.

While Bitcoin sits in this spotlight with an uncertain future, another alternative, Litecoin, has been released to share the cake. It seems Bitcoin needs to be prepared for competitors, and perhaps cryptographic currencies free of central authorities could one day compete with the dollar as well.

(Update) The Chinese government has banned the use of Bitcoin as currency in financial activities, although Bitcoin markets are still allowed as markets for an ordinary digital good.