Whether it’s nervous jitters or a bumpy road, it’s easy to inadvertently ruin a snapshot with blur. But a team at Microsoft Research is using the same motion-detecting tech found in your iPhone to lend an artificially steady hand.
Utilising only “inexpensive gyroscopes and accelerometers”, the Microsoft algorithm precisely measures the small movements that occur while a picture is being taken, then infers what the photo would have looked like had you managed to hold the camera still. The result is a digitally compensated image that, as the team’s sample photos demonstrate, is markedly sharper than the original.
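In rough terms, that inference amounts to turning the recorded camera motion into a blur kernel (the point-spread function the shake produced) and then inverting that blur. Here is a minimal sketch in Python/NumPy, assuming a hypothetical gyro log format and using a simple Wiener filter as the deblurring step; the actual Microsoft Research algorithm is considerably more sophisticated:

```python
import numpy as np

def kernel_from_gyro(gyro, exposure, focal_px, size=15):
    """Rasterise the camera's rotational path during the exposure into a
    blur kernel. `gyro` is an array of angular velocities (rad/s) about
    the x and y axes, sampled uniformly over `exposure` seconds -- a
    hypothetical input format, not Microsoft's actual sensor data."""
    dt = exposure / len(gyro)
    # Integrate angular velocity into angle, then scale by the focal
    # length (in pixels) to get the image-plane displacement.
    path = np.cumsum(gyro, axis=0) * dt * focal_px
    kernel = np.zeros((size, size))
    c = size // 2
    for dx, dy in path:
        x, y = int(round(c + dx)), int(round(c + dy))
        if 0 <= x < size and 0 <= y < size:  # assume motion fits the kernel
            kernel[y, x] += 1.0
    return kernel / kernel.sum()

def wiener_deblur(image, kernel, snr=100.0):
    """Undo the blur in the frequency domain with a Wiener filter."""
    H = np.fft.fft2(kernel, s=image.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * W))
```

Because the sensors supply the blur kernel directly, the hard “blind” half of deconvolution (guessing the kernel from the blurry image alone) disappears, which is what makes cheap hardware viable here.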
In its current form, the algorithm runs in conjunction with a bulky sensor attachment, but it seems plausible that these sensors could be internalised in future cameras. No word on when we’ll be seeing this technology hit the market, but on behalf of all of us who have unwittingly ruined an adorable photo, the sooner the better. More information and comparison photos can be found at the team’s site. [Microsoft Research via Engadget]