In our never-ending quest to make the technology we use smaller — I mean, the iPod Nano should really be nanoscopic, right? — we need to keep producing denser and more powerful components. One such component? The microchip. Despite the name, they're just not small enough, but they could be soon.
A lot of advances in integrated circuits — microchips — involve making them denser, but how small we can get them is currently limited by the teeny tiny copper wiring that runs electrical current throughout a chip's many layers. The problem is that we can make all those copper connections smaller, but as we shrink their cross-sections we increase their current densities, which can lead to all sorts of fun complications such as overheating and even melting.
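To see why shrinking the wires makes things worse, here's a quick back-of-the-envelope sketch. Current density is just current divided by cross-sectional area (J = I / A), so halving a wire's width and height quarters the area and quadruples the density at the same current. The wire dimensions and current below are hypothetical round numbers for illustration, not figures from the research.

```python
# Illustrative sketch: why shrinking copper interconnects drives up
# current density. J = I / A, so at a fixed current, halving a wire's
# width and height quarters the cross-sectional area and quadruples J.
# The dimensions and current here are made-up round numbers.

def current_density(current_amps: float, width_m: float, height_m: float) -> float:
    """Current density J = I / A for a rectangular cross-section, in A/m^2."""
    return current_amps / (width_m * height_m)

I = 10e-6  # 10 microamps through the wire

# A 100 nm x 100 nm copper line vs. one scaled down to 50 nm x 50 nm:
j_big = current_density(I, 100e-9, 100e-9)
j_small = current_density(I, 50e-9, 50e-9)

print(j_small / j_big)  # -> 4.0: same current, four times the density
```

Same amount of current, a quarter of the copper to carry it — which is exactly the overheating trap the researchers want to escape.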
The solution? Ditch the copper altogether. A group of researchers led by Professor John Robertson at the University of Cambridge in the U.K. is looking at how we might integrate carbon nanotubes into microchip structures. The result would be chips that could be built smaller, with nanotube interconnects that are thinner than copper wires yet able to support far higher current densities. That's just to start, too. In the future the nanotubes could prove to be far more resilient than copper in microchips.
Sounds good to me. I can always go for a smaller gadget, and I won't be happy until I have to whip out the ol' microscope to find where my phone is at.