Published By: Guardian - Today, Apr 30, 2012
How are we supposed to tell them apart?
Quick, close your eyes for a second and picture the 1920s.
What did you see? If you're anything like me, the projectionist in your head put on a newsreel consisting of black-and-white footage of flappers doing the Charleston, or a queue of men in flat caps patiently waiting for the Great Depression to kick off in earnest.
And chances are the footage was jittery and slightly speeded-up.
It's a curious testament to the power of moving pictures that you have to strain to remember that in reality, people walked at a normal pace back then.
The population didn't skitter about like restless insects.
If the 20s had actually unfolded at the speed they appear to in archive footage, the decade would've ended early, somewhere in the middle of 1925, thereby causing a five-year "time gap" during which everyone would have to stand perfectly still for fear of creating an "event" that might burst the bubble, sucking in all the neighbouring matter in the universe.
Or something like that.
Ask Doctor Who, he's the expert.
Our perception of eras seems chiefly dependent on the limitations of the technology that records them.
The 20s are speeded up in our heads because the cameras were cranked by hand, creating an unnaturally hasty frame-rate.
The 40s, however, are in part characterised by the crackly analogue sound that accompanies most war footage.
The 50s are a combination of starchy monochrome US shows and lush cinematic Eastmancolor that stretches into the 60s: this is the era of glamour and dreams, and a colour palette Mad Men seeks to emulate.
The 70s have a raw deal: they seem to chiefly exist in the form of grim, murky 16mm news footage of people gazing sullenly at acres of brown wallpaper.
With the sole exception of the Wombles, everyone in 70s footage looks as if they're being held there against their will.
Then in the 80s, our memories are transferred on to video, lending them a shiny, slightly tinny feel.
The analogue video age lasts until roughly the turn of the century, at which point everything starts turning crisp and widescreen.
Around 2005 things start making the transition to HD – and then we get to today, and a weird new trend is emerging.
I first noticed it some time around the Egyptian revolution, when I was suddenly struck by a Sky News report from Cairo that looked almost precisely like a movie.
Not in terms of action (although that helped – there were people rioting on camelback), but in terms of picture quality.
It seemed to be shot using fancy lenses.
The depth of field was different to standard news reports, which traditionally tend to have everything in focus at once, and it appeared to be running at a filmic 24 frames per second.
The end result was that it resembled a sleek advert framing the Arab Spring as a lifestyle choice.
I kept expecting it to cut to a Pepsi Max pack shot.
Since then, I've noticed similarly glossy-looking reports popping up on Newsnight and the like, so it may not be long until this is the norm.
I'm guessing it's a practical decision rather than an artistic one: this is how the new ultra-portable, ultra-useful digital cameras make things look: everything's a teeny bit polished, a teeny bit Instagrammed.
You see it everywhere: even Holby City looks like a flick these days.
The news is just following suit.
And oddly, this coincides with reports that an audience of cult flick buffs reacted badly to test footage from Peter Jackson's forthcoming Hobbit movie.
The Hobbit is shot at 48 frames per second – twice as many frames as standard films.
The studio claims this gives it an unparalleled fluidity.
The viewers complained it was too smooth â€“ like raw video.
Some said it looked like daytime TV.
What they meant, I guess, is that it seemed too "real", and therefore inherently underwhelming.
The traditional cinematic frame rate lends everything a comforting, unreal and faintly velvety feel, whereas the crisper motion of video seems closer to reality, and therefore intrinsically more harsh and pedestrian.
Therefore watching The Hobbit at 48 frames per second might feel like watching an edition of Homes Under the Hammer starring Bilbo Baggins (admittedly, every edition of Homes Under the Hammer features someone who looks like Bilbo Baggins, but you get my drift).
All of which means we may be nearing a frankly baffling position where TV news reports look more like traditional movies, and flicks look more like traditional TV news reports.
It's going to be harder than ever to tell the two apart, especially when you've got crossover stars such as George Clooney or Hugh Grant seamlessly flitting between the two.
Soon the news will be broadcast in 3D and the only way you'll be able to distinguish it from Hollywood cinema is to wait till the end to see whether it finishes with a CGI bunny in sunglasses dancing to a Black Eyed Peas cover version of a song you used to like, or a harrowing shot of an open grave stuffed with decaying corpses.
Both of which tend to put me off my popcorn.
Thankfully, for the sake of our collective sanity, the star-studded Leveson Inquiry has had the decency to commit to appalling production values.
It's nothing but witnesses burbling away in front of a dull white wall, intercut with one badly framed shot of a lawyer.
It looks like a soap opera shot on a shoestring by a local TV channel in Guernsey circa 1989, and for this alone it should be applauded.
Call me old-fashioned, but I think news should look like news, and hobbits should look like hobbits – and never the twain shall meet.
Charlie Brooker, guardian.co.uk © 2012 Guardian News and Media Limited or its affiliated companies.
All rights reserved.