vivek chaudhary's broadcasted articles in InoReader
Intel is expecting its next Atom tablet chip, code-named Cherry Trail, to be in devices by the end of this year, the company said this week.
Filmmaker M. Night Shyamalan hasn’t had the best run since captivating audiences with titles like The Sixth Sense and Unbreakable. In fact, the director has become something of a cinematic punching bag as his subsequent films have floundered and faltered at the box office.
Shyamalan understands this, and has since decided that maybe the best way to proceed after the disappointing returns on his summer blockbuster After Earth is to go smaller for his next production. How much smaller? How about a feature film with a cast and crew of 10 people? We're guessing After Earth had more people working just in the craft services department than that.
The director has taken to Twitter to promote his latest effort, a film titled Sundowning. Details about the project are pretty...
Facebook's $19 billion acquisition is winning the messaging wars thanks to an unusual programming language.
How do you support 450 million users with only 32 engineers? For WhatsApp, acquired earlier this week by Facebook, the answer is Erlang, a programming language developed in the '80s that is finally having its moment in the spotlight.
After years of false starts, there's a real buzz around the future of TV again – and for once it's got nothing to do with 3D.
4K is set to be the next big thing on our goggleboxes, asking us to shell out thousands once more to get the brightest and fanciest pictures ever beamed into our eyes.
But before you start sighing and working out how many months you'll have to save to afford that 60-inch TV to replace the one you only bought last year, we've got some bad news: fresh from announcing its big plans to deliver 4K over the 'net, Netflix is already envisaging the next generation of how we're going to watch TV.
While ultra HD is the thing that's got us all talking, apparently there's a massive change coming that's going to seismically shift the way content is shown and produced; suffice to say, you best get saving for that new 110-inch telly in 2018 as well…
Larger than life
"The next 2-3 years are going to be a fun ride," says Netflix's Neil Hunt, as we meet in a hotel room festooned with images of the streaming service's popular shows and more TVs than seem appropriate.
There's also an embroidered Netflix Quillow (quilt in a pillow) that we're considering stealing, but decide better of it as Netflix's chief product officer warms to his theme.
"The question I have is: how do we get from where we are today, which is fundamentally [viewing] the same picture you've been watching on the big screen for years, and turn that into the next level of evolution?"
It's a tricky question – TV is such a commoditised element of daily life that it's hard to find new ways to change something most would rather was left alone.
Hunt thinks he has the answer – and it's a necessary one given he thinks 4K isn't going to be enough for some people. "I personally have an expectation that by 2016 we'll have run out of resolution innovation. And the next thing will be 'wide field of view'. You heard it here first.
"Currently the guideline for a TV is that you view it from a distance equivalent to 1.5 times the screen size. What that means is when the director of photography shoots a scene he knows which wide angle lens to put on to make the perspective look elongated from normal.
"But it's entirely based on the physics of how you are from the screen, so if you sit further away from it, everything looks flat. If you sit too close, everything looks wide angle and too far apart."
So what is the answer to solving the distance problem? Hunt thinks it will take a radical re-imagining that will cause nightmares for set designers, producers and directors.
"The next innovation is for a screen that doesn't just fill 30 degrees of your vision but goes to 120 degrees field of view.
"That means a 100 inch widescreen that you view from the same couch distance as today. The interesting angle will be in the centre, but there will be added ambiance for the other two thirds around the edge."
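Hunt's figures can be sanity-checked with basic trigonometry. The sketch below is a rough illustration only: it assumes a flat 16:9 screen and treats the 1.5× guideline as applying to the diagonal, neither of which Hunt spells out.

```javascript
// Horizontal field of view (in degrees) of a flat screen of a given
// width, viewed head-on from a given distance.
function fieldOfView(width, distance) {
  return 2 * Math.atan(width / 2 / distance) * (180 / Math.PI);
}

// Width of a 16:9 panel from its diagonal size.
function widthFromDiagonal(diagonal) {
  return (diagonal * 16) / Math.hypot(16, 9);
}

// Today's guideline: a 60-inch TV viewed from 1.5x its size, i.e. 90 inches.
const couchDistance = 1.5 * 60;
const fovToday = fieldOfView(widthFromDiagonal(60), couchDistance);
console.log(fovToday.toFixed(1)); // ~32 degrees, close to Hunt's 30-degree figure

// Screen width a flat panel would need to fill 120 degrees from that same couch.
const width120 = 2 * couchDistance * Math.tan((120 / 2) * (Math.PI / 180));
console.log(width120.toFixed(0)); // ~312 inches of width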
"It's going to be great for nature photography – you'll be able to see the little details and something will tweet over there and you'll be able to actually look and see it."
It seems unlikely that this will even begin in 2016, as it will require a number of huge changes to an industry that's still trying to come to terms with the enhanced clarity 4K will bring. Hunt admits that it will initially require a set of standards to be adhered to, as well as working out how to show 'field of view' content on a regular screen, as just squashing it down will create a crazily distorted image.
And that's before the directors, actors and set makers get their heads around the conundrum of working out how to 'channel' viewers' attention to separate parts of the screen – but as Hunt points out, "it's something that the Imax guys have mastered, and we'll have to get that into home use."
With the likes of Sony and LG showing off short throw projectors at CES 2014, the ability to have a massive TV in your current lounge suddenly seems much closer to reality.
When is UHD not really UHD?
While most of the TV manufacturers are desperate to convince the watching public that 4K, or Ultra HD, is the next big thing, in truth it's still very much in its infancy – with content pretty hard to come by.
Hunt admits 4K isn't a deal breaker right now, calling it a 'passion project', but believes Netflix needs to be seen leading in the space to dispel the notion that the online service is a low-quality streaming platform.
"We're doing [4K streaming] partly for its brand value, to establish ourselves as leader in the space. The key piece for me [with 4K] is the goal here is to be at the forefront of delivering the best possible quality.
"You have to imagine that the number of 4K sets compared to the amount of content we have is not going to deliver much practical value, but it's going to help to cement the Netflix brand as something that's not scratchy 320p webisodes.
"The nice thing about streaming video is that you can build 4K relatively inexpensively and deliver it to the 0.1% of customers that have the necessary TV.
"If you're trying to do that as a traditional broadcaster, the amount of infrastructure, the servers needed to support the one user from each district that has bought the expensive equipment – even the upgrades to the set top boxes - requires a significant redesign that can't be repurposed. It makes it very expensive."
The next level of content is going to be a hard sell to consumers who are only just converting their media libraries to Blu-ray, so Hunt believes online streaming is the answer – it can also solve the 'chicken and egg' problem he describes, whereby cable and satellite broadcasters will be wary of upgrading to the new format without content, and nobody will make content without an easy delivery mechanism.
But Hunt believes that the very title of UHD is a misnomer for 4K resolution, as it will be far more technologically advanced than just more pixels:
"UHD is really three completely different pieces and 4K is just one of those. It's the pixel count on the screen, but it's like digital cameras, which have increased megapixels beyond the point of useful return.
"Beyond 4K there's a point of diminishing returns; we've already taken two bites of the apple, with 720 and 1080, and this is just another incremental improvement.
"But the other two pieces are the frame rate, which is about a smooth range of motion, and the high dynamic range (HDR) of colour, which is about the depth of each pixel.
"The question is: can we make each pixel stand for a bigger range of brightness and colours, something that makes the picture look amazing?
"Neither frame rate nor HDR are without controversy, but I'm convinced they are the future, the next generation of high quality. So for me at least, UHD is high frame rate, 4K and HDR in the colour. We won't see all of that in 2014; realistically it's probably 4K in 2014, then the other two in 2015 and 2016."
Pressed on these ideas, Hunt admitted that merely describing the changes improved frame rates and HDR would bring will be a hard sell to consumers. But he believes the industry must embrace them to give UHD TVs the jaw-dropping picture that will make buyers clamour to own one – and producers have to be prepared to shoot scenes now that take advantage of it.
"You won't want to use HDR to just make things brighter," Hunt explained. "It's got to be that 0.1% of the pixels on screen are brighter if the sun is reflecting on the water. The rest of the picture is unchanged, but the viewer thinks 'Wow, that reflection makes it so much more realistic, it's like you're there, it's almost unreal.' But that's the thing, it won't seem unreal at all, it's VERY real. It's hyper-real. I think that will be very compelling; we just need to learn how to use it."
When it comes to the future of TV technology, most viewers' excitement will be predicated on a couple of items: does the idea of a massive step forward in immersive viewing fill you with joy, or does it feel like a step too far?
And the more important factor: to achieve these high-quality effects, new TV technology will need to be developed, meaning even those who buy a decent 4K set now probably won't get the benefits Hunt is extolling.
But beyond that, fittingly, 2020 could be the year that our vision in front of the TV becomes clearer than ever before, with a level of realism that none of us could have foreseen.
As Hunt states, there are a lot of issues over standards and hardware to overcome before this can be a reality – but if the end result is something that is a bigger leap forward than the move from black and white to colour filming, then TV could be set for another renaissance at a time when many are consigning themselves to a lifetime of watching content on mobiles and tablets.
The year was 1980.
Apple, led by Steve Jobs, introduced the world to the miracle of personal computing. The PC revolution had started. IBM wanted in, but lacked the essential software necessary to deliver a complete consumer product. Enter Microsoft. A young Bill Gates and Paul Allen promised the technology giant that their small start-up had the operating system necessary to power their go-to-market strategy. Minor problem -- they didn't have it.
With a pirate-like flair, the two managed to purchase DOS (disk operating system) from Tim Paterson for the bargain basement price of $50K. The software was quickly repackaged and delivered to IBM as MS-DOS. IBM went on to use the operating system to power its PC line. And the rest, as they say, is history. That watershed event set Microsoft up as the de facto operating system in the nascent PC market. Microsoft went public six years later and is now worth over $300B -- $100B+ more than IBM.
Microsoft's success was, in large part, a result of their relentless focus on owning the essential platforms upon which software applications were developed and delivered. Using their leverage as a distribution channel for software, the company gained dominance in both the consumer and business application markets. During the dot-com explosion, Microsoft missed a number of plays. Much like today, they had their critics. But at the height of the boom, they leveraged their OS heft to move into the web in a big way. They bundled their newly minted Internet Explorer web browser with Windows -- for free. This move crushed the incumbent Netscape and solidified Microsoft's leadership position as the largest developer platform.
In recent years, it's clear that Microsoft has missed a great number of opportunities. Just as they did during the PC era, Microsoft has its critics. Unlike the PC era, however, the firm has not managed to successfully leverage their platform and resources to secure leadership positions in many key consumer web services such as search, e-commerce, etc. This may be in large part attributed to the massive regulatory hurdles they've faced. That said, Microsoft has managed to maintain their position as the ultimate distribution platform for software, the operating system. Until now.

The year was 2007.
Apple, led by Steve Jobs, introduced the world to the miracle of mobile computing. The revolution had started. Hardware manufacturers wanted in, but lacked the essential software necessary to deliver a complete consumer product. Sound familiar? Here's where the story takes a left turn. Enter Google. Taking a play from the Microsoft playbook, Google pushed the Android operating system as a free and open source solution for hardware manufacturers to go to market faster against Apple. With the introduction of tablets and a similar ecosystem dynamic to mobile phones, Google is now on the verge of winning the coveted position that Microsoft has held for the last 30 years -- control of the modern computing platform. And here's the kicker. Just like Microsoft before them, Google didn't build their way to this enviable position. They bought it. For how much, you ask? $40M.
Had Microsoft become the mobile operating system, they could have potentially recovered from their lack of innovation in the consumer market in recent years. With that distribution channel in place, they would have had the largest-reaching mobile app store and the default killer mobile applications like search, and could have solidified Internet Explorer in the browser spot, and more. Microsoft has since scrambled to win (read: buy) a share of the mobile operating system market. While I hesitate to call the game, it seems the proverbial horse has left the barn.
Google is now the apex predator of the digital world. Much like the once unstoppable Microsoft, Google is using their platform dominance and vast resources to colonize the consumer and enterprise application stack. And it's working. From advertising to robots, Google is seeking to power all things digital. But, if history has taught us one thing, it's that no empire is safe. From Rome to Microsoft, everyone gets sacked.
Will Google be regulated like Microsoft? Will that provide the opening necessary for one of the newly minted platform giants like Facebook, Twitter, or Dropbox to take the crown? Will Microsoft, led by new chief Satya Nadella, pull a Hail Mary? Or is there another giant brewing in the primordial start-up stew?
Only time will tell. But one thing is for certain: this case will be a classic for MBA students, filed under 'don't forget your core competency.'
Samsung's Chromebox hasn't exactly taken the world by storm — either because it's overpriced, underpowered, or simply not the form factor people want for Chrome OS. More likely, it's a combination of all of those factors, and not coincidentally Asus is trying to tackle at least two of those issues in its own Chromebox, to be released in March.
The new Asus Chromebox will be priced at just $179, and will also come with a gift from Google. Compared to Samsung's Series 3 Chromebox priced at $329, this box is much cheaper, though it won't have radically better specs. It's a small square box that will run on either a Celeron 2955U or Core i3 processor with integrated Intel HD graphics. Connectivity includes a media card reader, four USB 3.0 ports,...
Microsoft will renounce its "make-them-eat-Metro" strategy in an update for Windows 8.1 slated to ship this spring, if leaked preliminary builds reflect the final product.
It is the size of a postage stamp, but is being described as an invention that could topple titans. Indeed, the titans have all ganged up together to stamp out the little guy who has challenged them, dragging him all the way to the United States Supreme Court, which will hear the case this coming April.
If there's one buzzword that sums up CES 2014, it has to be wearables. There has been everything from cameras to earbuds. Not to mention a veritable deluge of smartwatches, wristbands and fitness trackers. Even Intel is in on the game, bringing us ...
Intel has completed work on a 64-bit version of Android OS for x86 smartphones, and the software will be ready to load on handsets with its upcoming Atom 64-bit chip code-named Merrifield.
As much as I like Google’s Chrome browser for Android, I still can’t fathom why there’s no simple function to show just the content of a web page. Apple’s iOS has this feature, it’s called Reader, and it’s outstanding. While on any web page, you just tap the icon on the left side of your address bar and the page is instantly transformed into only the text and relevant pictures of the page. No ads, no pop-ups, no sidebar images, nothing.
I’ve actually recommended a Chrome extension that does the same — it’s called Evernote Clearly — however, it only works on the desktop version of Chrome. So this past weekend, I had an epiphany: Why not try using a method similar to Clearly but on Android’s mobile version of Chrome? The good news is: I found a way.
I remembered that Readability offers a bookmarklet to do exactly what Clearly does. Bookmarklets are small bits of code that are applied to the currently shown web page, which is the exact situation here: I want to transform the current web page on Chrome for Android so that it just shows text. Here’s how to do that:
- Using Chrome for Android, navigate to the Readability bookmarklets page. You should see three different bookmarklets. Tap and hold the first one called “Read Now”. You should see a pop-up menu of options. Choose the “Copy Link Address” option, which will store the bookmarklet code in memory.
- Next, create a bookmark to this page. If you're not sure how to create a bookmark, just navigate to any web page in Chrome and tap the star icon that appears in the top right menu of Chrome. A bookmark edit screen should appear.
- Edit the Name of the bookmark. It can be anything, but I recommend something that’s quickly accessible from the keyboard. I named mine *read. You’ll see why I chose to start the name with the asterisk sign shortly.
- Click the Save button to save your bookmark in the Mobile Bookmarks section.
That’s it. You should be all set to clean up the clutter from a web page. To test it out, just navigate to any web page that has a decent amount of content; preferably one that also has ads and other distracting bits. Here’s an example of a long, current New York Times article, complete with ads, trending stories and more.
To de-clutter the page, just tap in the address bar of Chrome and the Android keyboard should appear. Swipe your finger from the ?123 key to the asterisk key and Chrome should show the *read bookmark; tap it and Readability will convert the page into a much easier-to-read view.
This should help explain why I chose to name my bookmark *read: It’s a simple shortcut that’s quick and easy to tap on the keyboard.
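Under the hood, a bookmarklet like "Read Now" is simply an entire script packed into a `javascript:` URL and saved as a bookmark's address; tapping the bookmark runs the script against the current page instead of navigating away. A minimal, hypothetical reader-mode bookmarklet (not Readability's actual code, whose real version fetches its parsing logic from Readability's servers) could be built like this:

```javascript
// The script to run on the current page: keep the main article content,
// drop everything else, and apply readable styling. (Hypothetical example
// for illustration; the <article> selector is an assumption.)
const readerScript = `(function () {
  var article = document.querySelector('article') || document.body;
  document.body.innerHTML = article.innerHTML;
  document.body.style.cssText =
    'max-width:40em;margin:2em auto;font:18px/1.6 Georgia,serif;background:#fff;color:#222';
})();`;

// A bookmarklet is the script prefixed with javascript: and URL-encoded,
// so spaces and newlines survive being stored as a bookmark's address.
const bookmarkletUrl = 'javascript:' + encodeURIComponent(readerScript);

console.log(bookmarkletUrl.slice(0, 40) + '...');
```

Saved under a name like *read, the bookmark works exactly as described above: typing the asterisk into the address bar surfaces it, and tapping it transforms the page in place.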
There’s been an uptick in the adoption of dense servers for cloud deployments and Intel hopes to capture a larger share of that market through server chips it will release next year.
Intel hopes to embed Broadwell-DE high-performance server chips onto motherboards. Intel’s first chips with a system-on-a-chip design, Broadwell-DE cannot be slotted, differentiating them from other standard server chips.
"It enables more of a dense board design," said Shannon Poulin, vice president and general manager of Intel Data Center Marketing Group.
The Broadwell-DE chips will ship in the second half of 2014 or early 2015, Poulin said.
Like pretty much everyone else who is a Beyonce fan, I have been fascinated and bamboozled by the surprise debut of Beyonce’s new album on the iTunes store (instead of the usual offline channels), which set all sorts of records in the process. Being a fan of her music and videos, I thought it was quite a nifty Christmas present and was one of the 830,000-odd people who bought her album — literally within minutes of seeing the tweet from her on Twitter. The “album” and its success prompted some random thoughts in my head.
- The much maligned “album” format has a future in an all-digital world. We just need to reimagine what the album looks and feels like for people to buy it.
- Just like people bought albums despite radio, we are going to buy albums despite Pandora and Spotify, as long as the artists give us a compelling enough reason to buy it.
- From the looks of it, video storytelling – not just the music – has to be a factor in the new concept of “album.”
- Twitter and other social media services have never been more important in getting the word out for musicians and artists.
Not everyone is a believer in the game-changer tag being attached to Beyonce’s album. In a blog post for The Washington Post, Dominic Basulto wrote:
Is Beyoncé’s new album really a “groundbreaking way to experience music” – as touted by iTunes? The mixing together of musical tracks and videos, that’s been done before — maybe not as artfully and seamlessly as Beyoncé did it — but it’s been done before. The “groundbreaking” part of the visual album is that once you visit iTunes to download the album, you can also listen to a mix of Beyoncé’s favorite songs on iTunes Radio. But even that concept – the curated playlist – has been around for years. What separates Beyoncé from all but a handful of performers in the music industry is her ability to do everything on a massive scale.
All fair points, but I believe the time for this kind of new approach to albums and music is now.
As a society, we have hit a point where we have much wider penetration of broadband — both wired and wireless — and can conveniently afford to consume such massive chunks of digital content. We are social-networked enough to actually forge a (faux) relationship with the artists. We live in the era of instant gratification and music is one of those things that provides instant gratification.
End of Physical Media?
Target might act all tough and not want to sell music from Beyonce, but then we might not really want to wait to go to Target in the future. And you can call me naive, but this is yet another example of the internet reinventing the notions of middleman, media and medium.
As I wrote last year:
The unifying fabric behind all these new behaviors is broadband. For the longest time, physical media was the container that moved content. Records became compact discs. Movie film became VHS tapes and then DVD. Books didn’t really change. And neither did newspapers and magazines. They are all mere containers.
It didn’t matter if you read Tom Wolfe in Rolling Stone, Esquire or in the form of a book. He created content (art, if you ask me) and the companies packaged and sold it in containers. They used their distribution networks — trucks, newsstand networks and book stores – to get us to pay for Wolfe’s works.
In the post-broadband world, the internet is the truck, and app stores are the newsstand and book store. Result: the slow and steady decay of physical media as a container for content. Sure, today people still have CD players and DVD players, but tomorrow when all our music will be either downloadable or streamed to us on many devices, who needs those CD players? The shift to digital music will increase with network density — that is, the number of connected people and connected devices.
The next logical step is that we are going to see a more app-like approach to the album — one that combines music, video and other experiences into an album-app worth paying $15. Lady Gaga, for example, recently released her experimental app that could very well point to the future where her albums are a bouillabaisse of her disparate and vast talents, tied together by her music. She has been always ahead of the curve in understanding technology and popular mass media and she knows how to bend it to her wishes. Her “app” release is a good indication that as a society we now consume “apps” and “apps” are the new containers of media.
Photo courtesy of Beyonce PR
Jay Z & Samsung Fail
I am not sure what the app/album looks like, but I definitely hope it won’t be anything like that privacy disaster of an app (Magna Carta) released by Jay-Z (Beyonce’s hubby) in partnership with Samsung; the very same app that The New York Times’ music critic Jon Pareles lamented by calling it “creepy.” Of course, since that was part of a deal rumored to be worth $20 million between Samsung and Jay Z, one would imagine the fallout was acceptable collateral damage. Still, it turned me off Jay Z and his music and left me with a bitter taste as a fan.
Basulto goes on to make the following point in his post:
Beyoncé’s decision to drop the album without any advance notice – getting an exclusive deal with iTunes in the process – is better viewed as a response to the changing dynamics of the music business. Beyoncé’s latest move is a business tactic, just like her husband Jay Z’s album release earlier this year, in which he released his “Magna Carta… Holy Grail” as part of a promotional stunt with Samsung to boost first-day sales as rapidly as possible. Artists are looking for ways to get paid for the music they create — and to sell as much of it as they used to back in the day.
And again, I don't really disagree with anything he says. However, I personally think the Beyonce model is a better approach. Beyonce raked in about $13.3 million in album sales in three days — without alienating fans or sending privacy nerds into a tizzy and, most importantly, while generating only positive buzz. If I were a popular musician, I would take a real hard look at my plans for how to release the next album. This is a good business model, one that can gain traction if Apple can figure out how it can help sell more albums.
The iTunes Story
The last point I wanted to make was about Apple and the role of its iTunes Store. Given all the attention lavished on Apple’s hardware products, the iTunes Store often gets overlooked for the role it plays in the paid digital content economy.
And when it does get attention, it isn’t the right kind of attention — rightfully so, because it is an under-leveraged asset that also encapsulates Apple’s inability to adapt to the more data-driven, algorithmic reality of the internet. The iTunes Store — on both the apps and music/video fronts — could be vastly improved if the company took a step back from its old way of thinking, but I digress and will leave that rant for another day.
Still, as Ryan Aynes, co-founder of Edge Collective, a music marketing firm, told USA Today:
“It’s a big move for iTunes to get back into the limelight and it helps their revenue. This is a dual play from the artist and the partner.”
Today, with the convergence of music, video and app type experiences, Apple actually has a unique opportunity to help jump start this new emergent ecosystem. Beyonce’s album could just be the spark that makes “albums” interesting all over again.
Beyonce. Photo courtesy of Beyonce PR
The globalization of the American movie industry is easily at its most apparent during the holiday movie season. The blockbuster hits that are opening here are, for the most part, opening throughout the world at the exact same time. As I write this, The Hobbit: The Desolation of Smaug is on screens in nearly every country in the world, as is The Hunger Games: Catching Fire, Frozen and Thor: The Dark World. But while those American blockbusters are atop the box office the world over, local films do still have a foothold on some of the big markets. In China (due to overtake the U.S. as the world’s largest film market in the next 10 years), the big hits are the action comedy No Man's Land (an unexpected success, as it spent three years on the shelf due to government...
It's harder in America to come out as an atheist politician than a gay one. Why?
Three and a half years after it aired, can a Lost fan finally make his peace with the finale? Er, read on...
Warning: contains spoilers for - you guessed it - the Lost finale.
Like millions of others, I watched Lost with a sharply escalating sense of fascination and frustration, and I can’t imagine I’m alone in having felt more of the latter than the former. Over the course of six sensationally strange seasons we were treated to polar bears in paradise, malevolent smoke beasts, dead men walking, nuclear bombs, weird samurais, exploding freighters, a generous dollop of time travel and a wee visit to a parallel universe, to name-check but a few of Lost’s more outlandish narrative flourishes. Nothing wrong with all that. I didn’t tune in expecting gritty realism; I knew that Lost wasn’t The Wire Does Hawaii.
Nevertheless, as I prepared to enjoy the culmination of this bat-shit crazy journey across time, space and Jim Robinson’s career, I asked myself a couple of important questions. Ultimately, how did each advancing plot point and seemingly arbitrary, out-of-the-blue cliff-hanger help with our understanding of what the island, and the series itself, was about? And, after six seasons of evasive - though engaging - story-telling, with little in the way of narrative resolution, would the writers tie up all of the plot strands in a neat little bow, or would they leave their beloved fans hanging?
My main fear was that the writers themselves never knew what the hell was going on, and had been content to roll along, episode to episode, season to season, chucking out twists hither and thither without any creative masterplan to guide them. I could imagine the scene in the writers’ room: ‘Hey, this new twist’ll be cool. It makes absolutely no sense whatsoever, and I don’t know how we’ll write ourselves out of it, but people will be surprised, and that’s the main thing, isn’t it? If viewers start to question how ridiculous it is we’ll just come up with something even more messed up and unbelievable to distract them from the first thing, and then repeat that formula until we get cancelled, or we all just decide to violently murder each other using ball-point pens.’
My second fear was tied to the first: that because of this anarchic approach to story-telling the writers would reach the last episode and opt for a cop-out, leaving a million unanswered questions, and eschewing rational explanation for some sort of mystical, religious piffle, just like the writers of Battlestar Galactica did with their finale.
So what can I say? Were my fears founded?
To accentuate the positives – because there were definitely positives – the finale was emotionally strong. We learned that the off-island action in the final season wasn’t a flash-sideways into an alternate reality, but a flash-forwards into the afterlife. The conceit was an effective one, and certainly kept me guessing until the final reveal. I was expecting the two realities to come clashing together, never suspecting for a moment that I was dealing not with quantum mechanics and string theory, but with good and evil, angels and demons, and the incontrovertible existence of the ‘other side’. I suppose the main reason I didn’t expect this ending was because I didn’t want to believe for a second that my approximately 100 hours of loyal viewership would be rewarded with what in narrative terms felt like callous disregard.
The episode approached its end with most of the main characters sitting in a church in the ‘flash-sideways’ reality, as they prepared to go towards the light. I must admit that the events leading up to this denouement were touching. The islanders, living in a world in which their plane had never crashed - and in which the island itself had sunk in the 1970s - were brought together by fate, and Desmond, and one by one realised the truth of where they really were, who they really were, and what had happened to them on the island (their lives before they’d died). The revelation of their deaths hit each of them like a thunderbolt: they recognised and remembered; romances were rekindled; lovers separated by death were reunited; friendships were re-forged and regrets buried. John Locke walked again, and forgave Ben. Ben stayed out of the church, not quite ready for Heaven. Kate told Jack that she loved him. Sayid and Sawyer got their women back. Aw.
Meanwhile, the action on the island – the real-time, real world – centred on Jack’s newfound acceptance of his role as the island’s protector, and his efforts to stop the smoke monster from using Desmond to rip a plug out of the island’s heart. He was too late. Desmond yanked the plug – surviving the experience thanks to his imperviousness to electro-magnetic energy – causing the island to start disintegrating. Don’t you just hate it when that happens?
Fortunately, though, this process also robbed the smoke monster of his God-like powers, making it a cinch for Jack to kill him. Jack then sacrificed himself to save the island, and his friends, by slotting the plug back into the island’s heart. The series ended with a dying Jack lying on the jungle floor, staring heaven-ward, and finally at peace. We knew where he was headed, so weren’t sad for him: we were happy. He was going to be with the people he loved. Ain’t that sweet?
As endings go it was pretty insidious. At first I was moved, and let those feelings override my judgement. Minutes later, my brain kicked back into gear and I felt cheated. Cheated out of six years of my life. I wanted to kill the writers - and myself - for believing even for a second that the ending was going to be a proper and satisfying one.
Desmond said it himself prior to pulling the plug: ‘You do realise that whether I pull this plug out and the island is destroyed, or whether you defeat Locke and save the island, it doesn’t really matter.’ No, you’re right, Desmond. But neither did anything else matter. ‘Whatever happened, happened,’ became the show’s catchphrase, and that perfectly sums up the attitude of the writers. ‘Yeah, screw it. Who cares? Just throw another curve-ball.’ The writers seemed to have been more interested in creating a disingenuous symmetry between the bits of random shit they made up – a mysterious Lost ‘Da Vinci Code’ in action – than in creating a credible, satisfying plot, or tying up the disparate, pointless strands of it they’d weaved.
Who put the plug in the island? What was the island? Why was Walt so special when the writers forgot about him after season 3? Why was Walt not in the church at the end? Or Michael? Or Mr Eko? Was this some sort of afterlife apartheid?
Why the hell was Widmore so interested in the island in the first place, and what did he ultimately hope to achieve through his murderous machinations? As enjoyable as it was to shout ‘Is Jim Robinson in every American TV show ever made now?’ at my TV screen each time I saw him, what was the point of Widmore and his evil quest? He seemed superfluous by the end.
If the flash-sideways was limbo, or God’s waiting room, why did so many people ‘die’, and why did so much horrible shit happen in it? What was the significance of all the Dharma Initiative stuff, and the button, and the time travel? In the end they were rendered nothing more than flashy devices used to spin out the show and create new, pointless sci-fi avenues for the writers to explore. I love science fiction and fantasy, both on-screen and in print, and as far as I’m concerned the zanier and bolder it is the better – as long as it’s consistent, and plays by the rules it sets for itself.
In a nutshell then? Yes. My fears were founded.
Perhaps I’m being a little too hard on Lost, but only because I feel cheated. The show had such scope, such potential, and to see it all squandered left me feeling hollow. It’s a crying shame, especially since the first few seasons are so exciting, engaging and enthralling. It’s only as the series rolls closer to its hideous, saccharine, nonsensical climax that the rot begins to set in – and even then the show is always beautiful to look at, and mercifully helmed by a cast of incredibly talented actors.
It would be interesting to watch the whole show again from the beginning, armed with the knowledge of its ending: to see just how big a narrative mess it was. But I don’t care enough. It was only the perpetual sense of mystery that kept me watching, and its disappointing resolution that made me sorry I had. There are plenty more box-sets in the sea.
Read Jamie's thoughts on The Sopranos finale, here.
A creature towers above a miniature landscape, its upper body stretching beyond wispy clouds. There is commotion underneath, created by the threat of being trampled, but the creature is oblivious. Its eyes trained on something else in the distance, its arms up ready for battle.
This could describe any of the many fantastic battle scenes in Pacific Rim, director Guillermo del Toro's monster bash which heads to Blu-ray November 11, but it's actually a description of The Colossus, a 19th Century painting by Goya.
According to Hal Hickel, animation director on Pacific Rim, the oil painting was one of the surprising inspirations for the look and feel of a movie that at first glance owes more to pixel power than a painter's brush.
"That was the first image he showed us, The Colossus by Goya, and the first words del Toro spoke to us about Pacific Rim were 'operatic' and 'poetic'," explains Hickel to TechRadar.
"He told us to forget about the painting's landscape and imagine it was the ocean and the clouds were sea spray and this figure was coming out."
And that was how the scene was set in pre-production for Pacific Rim, a movie with the achingly simple premise: man-made robots fight giant monsters.
Imagining these amphibious creatures (or Kaiju as they are known in the movie, a description ripped from the pages of Japanese storytelling) and giant robots (called Jaegers in Pacific Rim) takes a lot of computational power, VFX wizardry and, well, suspension of disbelief.
But this is something Hickel is well versed in. His CV is a geek tick-list of VFX laden Hollywood movies.
As animation supervisor at ILM (Industrial Light & Magic) his work can be seen in the Star Wars prequels, Super 8 and the movie that began Marvel's cinematic superhero onslaught, Iron Man.
It was Pacific Rim, however, that proved to be the perfect job for Hickel and his animation team.
"It was a dream come true working on this movie. If you could see my office, I am completely surrounded by toy robots and Kaiju.
"We hadn't worked with del Toro before but we all knew he had a great vocabulary for visual effects."
Before Pacific Rim, del Toro had already shown the world with Pan's Labyrinth and two Hellboy movies that he could successfully meld VFX with humour and pathos.
Much like his contemporaries Sam Raimi and Peter Jackson, del Toro came from the practical effects laden world of horror before hitting blockbuster status – a grounding Hickel believes worked well for Pacific Rim.
"The colour, the vibe, del Toro knew exactly the mood to go for with the movie," says Hickel.
"Pacific Rim is a bit like a comic-book film. We were taking on a goofy sub genre but del Toro had the right tone. He didn't want it to be too adult, he wanted it to be fun."
It's to the movie's credit, then, that its lumbering, oversized metal machines looked like practical effects in close-up but were practically all CGI. This was due to the 'colossal' scale of the movie del Toro, by way of Goya, wanted to create.
It was a movie that was even too big for the streets of Hong Kong.
"This was definitely a movie that was heavy on CG. Typically you go out and shoot as much as you can in camera but the scale meant we couldn't," notes Hickel.
"While the baby Kaiju scene was typical special effects, the big battles were 100% CG. The action was on such a huge scale. There aren't any boulevards in Hong Kong wide enough for that type of action. There is quite a lot of destruction and water interaction in the movie that we had to incorporate."
Keeping the human element
Being computer generated didn't mean that a human element was lost, though. In fact in creating the movie the practical nature of the creatures harked back to Japan's Godzilla movies of old, where the term Kaiju was coined and many of the monsters were portrayed by men in suits.
"We had a lot of conversations about the 'man in suit' look for the movie. That's what was so great about those Kaiju films. When, say, Godzilla got hit by a rock there is this sort of suit shake – the guy inside the suit actually shakes the suit! It is signature to these movies.
"What we did with our creature effects was pay homage without going all the way."
As for the 250-foot Jaegers, focusing on the smaller details of the suits helped the movie, reckons Hickel, even though Pacific Rim was all about monster bashing on an epic scale.
"It was tough figuring out the physics for the Jaegers. We didn't write tools to exactly replicate what a robot that size would do as it would feel too big, too ponderous, heavy, huge and boring. Some of the size came from editing and clever shot design and camera shots," says Hickel.
"Instead we felt it was important to differentiate the Jaegers. They are so slow, so we paid attention to pieces that could move without arresting the motion. If you did that stop-start thing it would feel goofy.
"We focused on the smaller incidental details and put little mechanical accents on smaller areas of the Jaeger to help with the scale."
Attention to 3D
On the big screen, 3D helped these little details pop out and Pacific Rim is one of the few movies where this technology works well, despite being an afterthought for the movie. Hickel puts this down to taking time with the conversion.
"Good 3D is all down to attention. As we were rolling into the 3D, John Knoll had already been involved with Star Wars conversion. He knew it was possible to do it well but you have to spend time on it. Fortunately, when it was announced that the movie would be 3D Guillermo became engaged with 3D. Straight away he sought John's eye for the movie.
"The big concern for us was miniaturisation. You shoot these 250-foot tall characters and bad 3D would make them look flat and small. Fortunately del Toro was concerned about this as well and kept an eye on it."

YouTube: http://www.youtube.com/watch?v=wyojUV29xuQ
After battling miniaturisation, it's obvious that size does matter for Pacific Rim – which makes the movie's arrival on Blu-ray an interesting one. Will it retain its epic feel on a smaller screen, especially for those who saw the movie on IMAX?
Hickel doesn't think it will be a problem, though he does admit that even though he worked heavily on the movie, he didn't see it on the biggest screen possible.
"I never actually watched Pacific Rim on IMAX, but I am really curious about it in the home. It may not have the same feel but it will definitely be enjoyable," explains Hickel.
"We are in an age where many people have such big screens in their home and I am dying to see it on Blu-ray.
"It is probably not a good film to watch on your iPhone, though."
Pacific Rim arrives on Blu-ray 3D, Blu-ray and DVD on November 11 from Warner Bros and Legendary Pictures. Own it first on digital download from November 4.
Intel on Wednesday announced the commercial availability of its long-awaited LTE modem targeted at smartphones and tablets – the first step in making it a credible challenger to dominant 4G chipmaker Qualcomm.
This isn’t Intel’s first LTE silicon. It launched an LTE modem last year, but it was a bit of a one-trick pony, supporting 4G networking but none of the 3G and 2G technologies mobile devices need to fall back on when LTE is unavailable. The new XMM 7160 fills that gap, linking to not only 15 global LTE bands but also HSPA and GSM networks. It even supports new voice-over-LTE (VoLTE) services. The radio chip is already supplying the wireless connectivity inside the European and Asian editions of Samsung’s Galaxy Tab 3 (which also happens to sport an Intel Atom processor), and starting today the modem will begin shipping worldwide to device makers.
This also isn’t Intel’s first trip to the mobile broadband rodeo. In the last decade it tried to spearhead WiMAX as the successor to 3G, but the initially promising technology failed to gain traction among global carriers. That sent Intel back to the drawing board. It bought Infineon back in 2010, making it an instant force in the 2G and 3G radio market, but it was still far behind Qualcomm when it came to LTE. Qualcomm modems powered the first LTE smartphones back in 2011, and it’s been on a roll ever since.
“Late”, however, is a relative term, according to Aicha Evans, Intel VP of wireless. Qualcomm may have established early dominance, but Intel has basically built an LTE business from scratch since it acquired Infineon, Evans said. “We did in three and a half years what some couldn’t do in seven years,” Evans said.
Intel hasn’t just built a one-off chip, Evans said. It’s developed a product pipeline, which will produce multiple new modems over the next year, she said. In the first half of 2014, Intel plans to ship a more advanced modem, supporting new LTE-Advanced techniques like carrier aggregation and the TD-LTE networks used by Sprint, China Mobile and many other global carriers. Also in the pipeline for 2014 is a multimode LTE module optimized for tablets and ultrabooks.
Still, Intel hasn’t fully caught up to Qualcomm. The San Diego chipmaker is still the only LTE vendor with a fully integrated system-on-chip combining a baseband modem with applications and graphics processors. That kind of one-stop shop definitely gives Qualcomm an advantage when it comes to mid-range smartphones, Evans said, but she pointed out that many high-end devices are still designed with discrete modems and processors.
The most important thing for Intel, Evans said, is to first establish credibility in the market with high-end devices like the Galaxy Tab, which will then lead to deals for lower-end devices once its integrated silicon is ready. When will that be? Evans wouldn’t reveal any specific timeline, but she said Intel would have more to say about an integrated Atom-LTE chip in the coming quarters.
Right now the LTE market is still relatively small compared to the 3G market – it just seems big here in the U.S. because of its fast pace of LTE adoption – so Intel still has a huge opportunity to capture market share. But it’s not the only one chasing Qualcomm’s dust. Broadcom, MediaTek, Marvell and Nvidia all have multimode LTE in their pipelines, and Nvidia has promised to deliver its own Qualcomm killer – a Tegra 4 processor combined with an LTE modem – in the first quarter.