We generally expect computers to give us precise and accurate answers every time, all the time. After all, that's why computers are computers. But as it turns out, if we cut them a little bit of slack in the accuracy department, we can easily make them a thousand times faster.
Being accurate all the time is a lot of work for a computer, even if that's what it's designed from the ground up to do. It takes big circuits and a lot of number-crunching power. If, on the other hand, you let a computer be just a little less accurate, say 1% less, you can get by with much smaller and more efficient circuitry. And 1% is really not that big of a deal: all it means is that if you ask a computer to add 100 to 100, every once in a while you'll get a 198 or a 202.
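To make that concrete, here's a rough sketch in Python of what "1% less accurate" means for an adder. This is just an illustration of the idea, not how MIT's hardware actually works, and the function name `approx_add` is made up for this example:

```python
import random

def approx_add(a, b, tolerance=0.01):
    """Add two numbers, but let the result drift by up to
    `tolerance` (1% by default), mimicking a smaller, sloppier
    adder circuit."""
    exact = a + b
    # Perturb the exact sum by a random relative error within the tolerance.
    error = random.uniform(-tolerance, tolerance)
    return exact * (1 + error)

# 100 + 100 usually lands near 200, but can come back anywhere
# from 198 to 202.
results = [approx_add(100, 100) for _ in range(10)]
print([round(r, 1) for r in results])
```

The payoff isn't visible in software, of course: in silicon, tolerating that drift is what lets the circuit shrink.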
It's true that in some cases, you need to be able to count on your computer hitting 200 bang on the nose whenever you ask it to. But for applications like video editing, where you're dealing with a bazillion tiny pixels that people can barely see, absolute perfection is much less important.
Researchers at MIT have developed a special computer chip designed for video processing that's a lot like that 1,000-core processor we covered last month. MIT's chip, though, does away with all-to-all communication between the individual cores. Instead, each core just talks to the cores nearby, in the same way that when you're processing video, each pixel only really cares about the pixels around it. With this architecture, a chip with 1,000 cores is actually a full 1,000 times faster than a chip with just one core, at least when it comes to processing video. It's also good for analyzing protein folding, some specific types of data searches, and other tasks that depend mostly on local interactions between small pieces of data, where fudging things slightly isn't a big deal.
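The "each pixel only cares about its neighbors" idea is easy to see in code. Here's a minimal Python sketch of a 3x3 box blur, a classic video-processing operation: every output pixel is computed from its immediate neighbors alone, so cores handling distant pixels never need to exchange a word. This is a generic example, not MIT's actual algorithm:

```python
def box_blur(image):
    """3x3 box blur over a grayscale image (list of rows of ints).
    Each output pixel depends only on its immediate neighbors,
    which is exactly the locality a neighbor-only core grid exploits."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            # Visit the 3x3 neighborhood, clipped at the image edges.
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out
```

Because each output pixel touches only a tiny fixed window of inputs, you can carve the image into tiles, hand one tile to each core, and the cores only ever need to chat with their neighbors about the pixels along their shared borders.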
The eventual concept for this mostly accurate hardware is that it would work alongside a conventional processor, kinda like a graphics card does, taking on the specific tasks it's suited for, like video rendering. You'll have to get used to the fact that your computer isn't perfect anymore, at least on a theoretical level, but for a graphics card that's 1,000 times faster, I'd happily live with the occasional wayward pixel.