22 August 2011

Complete Review : Driver San Francisco


What is it?

The latest game in the movie-inspired Driver series combines Crazy Taxi's nosebleed-inducing thrills with Steve McQueen's Bullitt, and emerges with a plot that's about as sensible as riding a pogo stick through a minefield. While blindfolded.

 


Sounds intriguing. What's the story then?

You play John Tanner - a cop left in a coma after his nemesis escapes custody in an armored police van. It sounds a bit Life on Mars, and the events of the game take place within this unconscious state, with Tanner able to 'Shift', Quantum Leap-style, between vehicles and temporarily inhabit the characters driving them.

Ah, the cars. That's what it's all about, right?


It's not called Driver for nothing. There are 120 fully-licensed vehicles that, if you're not careful, are also fully destructible. The in-car views are excellent: Tanner's body is visible at all times, and the model is unusually high-quality. It's satisfying to watch him reach for the hand brake, or to see his palm steering while pulling off evasive manoeuvres.

Here's the video trailer for the game:


What, like 45° hill starts?

Quite. Over 200 miles of San Francisco's streets have been recreated, complete with trams to dodge and notable landmarks such as the Golden Gate Bridge and the super-twisty Lombard Street to watch out for. Parts of the city have been changed to make them more driveable, but if you're familiar with San Fran you'll feel at home - except for the fact that you're driving fast.

Some screenshots:






I'm done sightseeing. Now what?

Apart from the main missions there are bonus tasks, 19 multiplayer modes and unlockable set pieces inspired by some of Hollywood's best-known car chases.

 

21 August 2011

Near Field Communication


It's the future of money, of sharing and of ticketing - and here's how it works



What is NFC?

Near Field Communication is a secure, short-range wireless technology that allows devices to exchange information. It's a subset of RFID (Radio Frequency Identification), but differs by only working at distances of about 4cm or less.


 
Why does it matter?

NFC's ability to securely communicate with both powered and 'passive' targets makes it the perfect tech for turning your phone into a 21st century wallet, and more. Orange and Barclaycard's 'Quick Tap' system (see below) could be the beginning of the end for debit cards; a trial in New York recently let subway passengers use their NFC phones as travel cards; the next version of Android will allow users to share all kinds of data with a touch; and the tech can also read smart posters such as Google's 'Hotpot' stickers. It's the future - just don't lose your mobile.

How it works    

When the NFC chip and its antenna loop pass into the terminal's magnetic field, an electric current is generated. This process, as physics boffins will know, is called 'magnetic induction'. The induced current flows through the antenna coils in both the phone and terminal, creating a secure link that allows them to communicate via short-range radio waves.
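If you fancy poking at this from the other side of the link, the open-source Python library nfcpy can drive a supported USB reader and talk to passive tags. A minimal sketch, assuming you have a reader nfcpy recognizes (the callback name here is ours):

    import nfc

    def on_connect(tag):
        # Runs once induction has powered the tag and the link is up;
        # printing the tag shows its type and unique ID
        print(tag)
        return False  # release the tag straight away

    clf = nfc.ContactlessFrontend('usb')  # open the first USB reader found
    try:
        clf.connect(rdwr={'on-connect': on_connect})
    finally:
        clf.close()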

Anatomy of a 'Quick Tap' Purchase


1. First, link a Samsung Tocco Quick Tap (£60 PAYG, shop.orange.co.uk) to your Barclaycard, Barclays account or Orange PAYG credit at barclaycard.co.uk. You can transfer up to £150 (or a maximum of £100 in one go) to the phone via the Quick Tap app.

2. After informing the cashier that you wish to pay by Quick Tap - and allowing time for him to call his line manager to explain the system to him - touch your phone on the shop's contactless terminal (known in NFC lingo as a 'tag'). This is where the magic begins...

3. Thanks to the power of magnetic induction (see 'How it works' above), your phone and the terminal now have a secure, short-range connection. Entering your PIN is optional at the moment, as transactions are limited to £15, but when this limit rises you'll need to have tapped your PIN into the phone prior to touching the contactless terminal.

4. Once the NFC connection is made, your transaction info is whizzed over to Barclaycard's servers, encoded with the same cryptographic algorithms used for Chip and PIN cards. Each code is specific to your purchase, so it can't be cloned for nefarious purposes (see the sketch after this list).

5. Payment is authorized (or not) via the existing payment network, so mobile reception isn't required. The Tocco Quick Tap and the terminal confirm that payment was successful, and your phone gets your account balance from Barclaycard via the Orange mobile network. You leave the shop feeling more futuristic than Marty McFly in Hill Valley circa 2015.
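Barclaycard doesn't publish its exact scheme, but the one-code-per-purchase idea in step 4 works much like an EMV-style cryptogram: a secret key and an ever-incrementing counter fed through a MAC. A purely illustrative Python sketch - the key, counter and field layout here are invented, not Barclaycard's:

    import hmac, hashlib

    SECRET_KEY = b'issuer-provisioned-key'  # hypothetical; the real key lives in the phone's secure element

    def transaction_code(amount_pence, merchant_id, counter):
        # MAC over the purchase details; a fresh counter value makes every code
        # one-shot, so a sniffed code can't be replayed for a second purchase
        payload = "{}|{}|{}".format(amount_pence, merchant_id, counter).encode()
        return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

    # Two taps for the same 2.50 GBP coffee produce different codes
    print(transaction_code(250, "CAFE-0042", 41))
    print(transaction_code(250, "CAFE-0042", 42))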




Share your experience of NFC in the comments below!


17 August 2011

Turn Your PC into a Tablet



Don't tell your iPhone-owning, iPhone-loving friends, but they're not relevant anymore. No more do you have to put up with them flashing their phone around down the pub proclaiming: "This app is hilarious". No, no it's not, it's a grown man pretending his phone is a pint of beer, or worse, making fart noises by pressing a big red button. Genius, that's exactly why humanity spent a century discovering electricity, semiconductors and million-line operating systems that can run on a 500mW processor. Ranting aside, Google Android has done an impressive job of catching up with Apple, perhaps not so much on the consistency or slickness of the interface, but most definitely in the functionality, flexibility and features that it offers.

Another area where Google excels, much like Microsoft, is wooing and caring for developers. The Android SDK is freely available to download for both Windows and Linux. This opens up some interesting features for everyday folk, not just developers. One is running Android on your desktop PC in the form of an Android emulator. It's designed so you can test your newly compiled apps without needing an actual device, but it gives you a chance to play around with a functioning Android phone or tablet running any version of the OS from 1.5 through to 3.0. There are even manufacturer-specific builds available, such as the Samsung Galaxy Tab and various Sony Ericsson phones.

Firing up the SDK


We're going to take you through how to install and run your own Android emulator, plus do a little hacking. We'll also 'root' the emulated device so we can hack in Market support. The SDK is useful beyond this as well: you can use it to screen-grab an Android device (for example, your phone).

The Android emulator is a development tool; it's not like the Amiga, Spectrum and console emulators you may have tried over the years. It isn't optimized for speed and it runs on a single thread, so our old 2.8GHz Core 2 Duo felt sluggish - a Core i5/i7 model with Turbo Boost is going to help things along here. That aside, everything inside Android is available to play with, apart from the Market. It all runs well too, apart from web browsing, which does seem to chug.

The emulator also comes with a number of useful options and additional compiled operating systems to play with. If you follow our hacking guide you'll see how to start the emulator from a command line. This is mostly useful for accessing advanced test features, such as adding GPS locations, adjusting audio and routing fake calls. There are more basic options too, such as '-no-boot-anim', which speeds up start times by skipping the opening animation.
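Launching from a command line is easily scripted, too. A quick sketch, assuming the SDK's tools folder is on your PATH and you've created the 'marvin' AVD from Part I (the option values are just examples):

    import subprocess

    # Start the 'marvin' AVD without the boot animation; '-scale 0.7' shrinks
    # the window to 70% without changing the emulated resolution
    subprocess.Popen(["emulator", "-avd", "marvin", "-no-boot-anim", "-scale", "0.7"])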

Editing features

The built-in emulator controls tend to be more easily accessible. For instance, when you click Start, the ghosted-out 'Scale display' option provides an easy way to control the size of the emulation window without affecting its resolution; usually five inches will get an oversized window under control. Alternatively, select the device, click 'Edit' and under the Hardware section you can add and adjust a number of attached emulator features. These include adding a virtual D-pad or keyboard, and controls over partition sizes and memory allocations.

Bless Google, they do try, but in the scramble to get the thing out of the door, it feels like the odd useful item gets forgotten. One such item is any sort of built-in way of taking screen grabs and fetching them off an Android device. Thankfully someone at Google remembered, because part of the SDK debug tool is a built-in capture option, and we've outlined just how to get that working in our last walkthrough.
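The SDK also ships with monkeyrunner, a Jython scripting tool that can take snapshots without touching the debug interface at all. A minimal sketch - save it as grab.py and run 'monkeyrunner grab.py' with the device or emulator connected:

    from com.android.monkeyrunner import MonkeyRunner

    # Wait for a device or emulator to appear over ADB, then snapshot its screen
    device = MonkeyRunner.waitForConnection()
    image = device.takeSnapshot()
    image.writeToFile('screenshot.png', 'png')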

Grabbing video

As it turns out you can take this feature one step further and use it to grab video footage, of a sort. There's a damn handy tool called Droid@Screen, available at blog.ribomation.com/2010/01/droidscreen. This standalone Java program hooks into the Android SDK's ADB tool to stream your Android device's output to your desktop. It was originally devised as an easy way to project an Android screen via a PC.

Download and run the Java tool and the first thing that it'll do is ask you where ADB lives. This will usually be C:\Program Files (x86)\Android\android-sdk\platform-tools, unless you installed the SDK elsewhere or have a 32-bit operating system, in which case knock off the (x86).

If the Android device is connected, Droid@Screen should pick up the ADB tool and display the device's screen for you. If you want to capture video, use camstudio.org to record what's going on in its window. You can even specify a capture region, so you can grab exactly the Android screen area and cut out the borders of the window.
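If you'd rather not involve a second tool, the same monkeyrunner trick can fake low-rate video by snapshotting in a loop - cruder than a proper screen recorder, but it works. A rough sketch (frame count and delay picked arbitrarily):

    from com.android.monkeyrunner import MonkeyRunner

    device = MonkeyRunner.waitForConnection()
    # Grab 50 frames at roughly five per second; stitch them into a video later
    for i in range(50):
        device.takeSnapshot().writeToFile('frame%03d.png' % i, 'png')
        MonkeyRunner.sleep(0.2)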

PART I : Emulating Android


Run an Android tablet on your desktop PC, in three easy steps:

1. Snag the SDK


The key to getting all of this good stuff up and running is the kind heart of Google, which provides the Android SDK freely for download. Head over to developer.android.com/sdk and grab the latest Windows executable; at around 32MB this is only a pre-installer. The rest of the Android SDK is downloaded as optional extras, enabling you to install versions from 1.5 through to 3.1. We suggest grabbing it all.

2. Enjoy Some Java


Android as a platform is Java-based, so having the right version of Java is somewhat important. Oracle, now the owner of Java, provides a bewildering array of variants, and it's likely you already have a Java Runtime Environment, or JRE, installed on your system. To use the Android SDK you need a Java Development Kit, or JDK, from bit.ly/bMkbpo; we suggest grabbing the JDK + NetBeans bundle.

3. Paranoid Android


Click the top-left 'Virtual devices' entry; this lets you create and manage all of your virtual Android devices. Click the 'New' button to get this bandwagon rolling. Give the device a name (ours is 'marvin'), choose a suitable Android target - 3.1 will do - and enter a 512MB SD card size. You can keep the other settings as they are, though feel free to pick your own resolution or drop the density from 160 to make icons larger. If you prefer the command line, the sketch below does the same job.
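A minimal equivalent using the SDK's 'android' tool via Python - this assumes the SDK tools folder is on your PATH, and that 'android-12' is the 3.1 target id on your install (run 'android list targets' to check):

    import subprocess

    # Create a 3.1 AVD named 'marvin' with a 512MB SD card, as in the GUI steps
    subprocess.call("android create avd -n marvin -t android-12 -c 512M", shell=True)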

PART II : Setting up Shop 

Get more from your emulator - enter the Market

1. Secret files

Taking the 'marvin' device we created earlier, you're going to have to copy some files around, push some files to the emulator and generally bully Android. To start, copy the system.img file from x:\Program Files (x86)\Android\android-sdk\platforms\android-11\images to x:\Users\<username>\.android\avd\marvin.avd, replacing the 'x' and <username> with your own.
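If you'd rather not click through Explorer, the copy is a few lines of Python - assuming the default install path on the C: drive and our 'marvin' AVD name:

    import shutil, os

    src = r"C:\Program Files (x86)\Android\android-sdk\platforms\android-11\images\system.img"
    dst = os.path.join(os.path.expanduser("~"), ".android", "avd", "marvin.avd")
    shutil.copy(src, dst)  # drops system.img into the AVD's folder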



2. Follow the path

To make life easier, add the following paths to the Windows PATH variable. Do that by right-clicking on My Computer > Properties > Advanced system settings > Environment Variables..., select 'Path' from the System Variables box and click 'Edit'. Add the two paths below, separated with a ';': 'x:\Program Files (x86)\Android\android-sdk\platform-tools' and 'x:\Program Files (x86)\Android\android-sdk\tools'.

3. Grab the APK

Two application files, GoogleServicesFramework.apk from bit.ly/inxlei and Vending.apk from bit.ly/i9UcYL, need to be pushed to the emulated device. Download, extract and place them on your Desktop. This all has to be done from a command line, so select Start, type 'CMD' and press [Return] to open a command prompt, then type 'CD desktop' to change directory to where those apk files are.

4. It's a shell start

Fire up the emulator as a background process with a 200MB system partition by typing the following: 'Start /B emulator -avd marvin -partition-size 200'. Type 'adb shell' to open a virtual shell for the device. This lets you communicate with the emulator's system kernel. Make the system partition read/write and root the device with: 'mount -o remount,rw -t yaffs2 /dev/block/mtdblock0 /system'.

5. Push it real good

We've assumed in the last step that the system partition is mtdblock0. If it's not, you'll need to type 'mount', make a note of the system partition and adjust accordingly. Finally type 'rm /system/app/SdkSetup.apk' and then 'exit'. Back at the command prompt type these two lines: 'adb push GoogleServicesFramework.apk /system/app' and 'adb push Vending.apk /system/app', and the Market icon will appear.
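Once the paths from step 2 are set and the apk files are on your Desktop, steps 4 and 5 can be wrapped in one script. A rough sketch of the same commands - the 60-second boot wait is a guess, and it still assumes the system partition is mtdblock0:

    import subprocess, time

    subprocess.Popen("emulator -avd marvin -partition-size 200", shell=True)
    subprocess.call("adb wait-for-device", shell=True)
    time.sleep(60)  # crude: give Android time to finish booting

    # Remount /system read/write, drop the setup stub and push the Market apks
    subprocess.call("adb shell mount -o remount,rw -t yaffs2 /dev/block/mtdblock0 /system", shell=True)
    subprocess.call("adb shell rm /system/app/SdkSetup.apk", shell=True)
    subprocess.call("adb push GoogleServicesFramework.apk /system/app", shell=True)
    subprocess.call("adb push Vending.apk /system/app", shell=True)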

6. Beat the system

It's likely the Market won't log in at first. If that's the case, in the emulator select Settings > Applications > Manage Applications > All. Find the GoogleServicesFramework task and select 'Clear all data' and 'Force close', then do the same for the Market task. Restart the emulator from the shell. This step is a little hit and miss; it's important not to start the emulator from the main SDK Manager, and to leave 'Snapshot' unselected.



15 August 2011

Zorin OS: It's Linux Designed Especially for Windows Users

The Default Zorin OS Desktop

Zorin OS is an Ubuntu derivative with an interface set up to resemble the Windows desktop.

This release didn't inherit Unity, and instead recycles GNOME 2.32.

The release statement read: "We have included new features such as an installer welcome video, a new theme and updated artwork, simplified application names, updated software and many program changes to improve and simplify the user experience."



The shift window switcher in Zorin OS

Zorin OS actually comes in several versions. Core is free to download, but the Ultimate edition asks for a "donation" of 970 INR + 200 INR shipping, or 650 INR to download. Zorin OS Lite, Educational, Business, Multimedia and Gaming versions will soon be available for a donation as well.

Zorin OS strives to look like Windows, and furthers the connection by including Wine, PlayOnLinux and WineHQ. It's interesting to note that when Notepad is listed in a Linux distribution menu, it usually opens Gedit or the like - but in Zorin, it opens Wine's Notepad. Internet Explorer and Wine File Manager are also included. Linux software includes LibreOffice 3.3.2, Empathy, Evolution, Chrome, Banshee, Brasero, Cheese, VLC, GIMP and Shotwell.

Zorin OS is also available on Zorin touchscreen laptops with a 1.6GHz Intel Atom processor, 2GB of RAM and a 160GB hard drive.


11 August 2011

Future of The Hard Drive


All the recent fuss about solid state storage might make you think that the traditional hard drive is not long for this world. Capacities have increased, but the basic design is still recognizable from the very first IBM RAMAC drive, introduced in 1956, albeit in a considerably smaller form. Surely the future of storage can't be based on something so primitive?

Think again, because nothing comes close to a traditional hard drive for storing the ever increasing amount of data the modern world produces.

And while the physical limits of current hard drive technology are rapidly being reached in terms of the number of bits that can be stored within a square inch of platter, it won't be too long before a two-terabyte drive looks the way a floppy disk does to us today. That's thanks to a new way of getting information on and off a platter called, excitingly, 'heat assisted' recording.

The basics of a traditional hard drive are quite simple. Data is stored on circular platters made from a glass and ceramic mix, sometimes aluminum, and coated in a thin layer of magnetic material made of varying mixtures of cobalt, chromium, tantalum, nickel and platinum on the top and bottom.

In a desktop drive, these platters spin at 7,200rpm. That rises to 15,000rpm for a top-performance server drive and drops to 5,400rpm for general laptop storage. It means the outer edge of a platter is moving at around 67mph while the drive is in use.
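That 67mph figure is easy to sanity-check: linear speed is just the track circumference times revolutions per second. A quick back-of-the-envelope in Python, assuming an outermost data track of roughly 80mm diameter on a 3.5in platter (our assumption, not a manufacturer's spec):

    import math

    rpm = 7200
    track_diameter_m = 0.080  # assumed outer data track diameter, ~80mm

    speed_ms = math.pi * track_diameter_m * (rpm / 60.0)  # metres per second
    print(round(speed_ms * 2.237))  # convert to mph -> prints 67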

Both sides of each platter are recordable, so for high-capacity drives, read/write heads are sandwiched between platter layers with one head for each surface. These heads contain three elements: two magnetic coils for reading and writing data to the platter, and one air bearing, which helps the head maintain a steady distance of just a few nanometers above the disk surface.

Hard drives just keep getting bigger. Seagate has just announced a family of drives that can fit a full 1TB of data on each platter - an areal density of around 625 gigabits per square inch. It's reckoned that drive capacities have doubled every 24 months or so ever since the RAMAC.
 
Now and then

At the moment, drives work using a technique called 'perpendicular magnetic recording' (PMR). The recording layer on a platter surface is filled with tiny magnetic particles called 'grains'. A single bit of data takes around 100 grains to store securely, and the trick to increasing capacity is to cram more of these grains into a smaller space.

In a PMR disk, grains are arranged at right angles to the platter surface, so they're standing up. Previously, in 'longitudinal magnetic recording' they were arranged end to end horizontally. It stands to reason that you can squeeze more in with the newer technology, which has been commonplace for the last five years.

Physical limits


The problem that has always dogged hard drive manufacturers is that there are physical limits to the number of grains you can fit into a square inch before they begin to randomly flip their magnetic state and destroy data. PMR is already getting close to those limits. In order to carry on increasing capacity at historical rates, something completely new is required.

That something is looking increasingly likely to be 'heat-assisted magnetic recording', or HAMR for short. A steering committee called the Advanced Technology Consortium was recently set up by the International Disk Drive Equipment and Materials Association (www.idema.org), which includes representatives from all the hard drive manufacturers, to produce a common roadmap for the shift to HAMR technology. This has a strong precedent, thanks to a similar initiative that helped transition hard drives from the decades-old method of laying down information in 512-byte logical sectors to a larger, more efficient 4KB format. That transition was finally completed this year, so hopefully the next goal will be met just as smoothly.

But what is HAMR? Basically, researchers discovered several years ago that heating up a magnetic surface prior to writing information to it can increase the accuracy and efficiency of write heads astronomically, while cooling the surface back down improves the ability of a read head to read that data back.

The future involves a small and highly focused laser, mounted on the drive head, which heats up the area of the platter about to be written to. This area then rapidly cools down as the drive spins, ready for long-term storage and reading operations.

There are a few details still to be settled, such as whether a laser spot works better at two nanometers or ten, that kind of thing, but ultimately it should lead to drives capable of cramming ten times as much data into the same amount of space they use today, at little extra cost.
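To put that ten-fold claim in context against the historical trend mentioned earlier - a doubling every 24 months - a one-line calculation shows it amounts to roughly the next six to seven years of growth:

    import math

    doubling_period_years = 2.0   # capacity doubles every 24 months (historical rate)
    capacity_multiple = 10        # HAMR's promised ten-fold increase

    print(round(math.log(capacity_multiple, 2) * doubling_period_years, 1))  # -> 6.6 years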

Tech demos

We mailed Rich Rutledge, vice-president at storage giant Western Digital, about this new HAMR technology and how it's being implemented right now. "We've all [hard drive manufacturers] done demos of the technology that have demonstrated functionality, but we haven't quite crossed over in terms of technology yet," he said. So it is up and running, but at the moment it's not ready to be simply dropped into the sort of hard drives we've got backing up our media libraries at home.

"We're able to use last year's technology with heat assisted. What we're not able to do yet is next year's technology with heat assisted," says Rutledge. But hopefully the technology isn't too far off. The first drives to use actual heat-assisted write heads could be here within the next two years.

10 August 2011

The Death of DirectX


Time was, the latest version of DirectX would render previous versions outdated and redundant, but lately things have changed. The first version of DirectX 9 appeared way back in December 2002 (older than most Justin Bieber fans, then), and yet we're still seeing the most eagerly anticipated big-budget releases built around DX9 almost a decade later.

It isn't as though DirectX 10 and 11 have nothing to offer. You only have to load up Crysis to see what's possible when a development studio goes to town with the new features a graphical API has to offer. The jungles of Crysis offered up motion blur, post-processing and lighting effects that we simply hadn't seen before. It caused quite a stir, you might recall.
 
Things looked bright in the early days of DX10. Bioshock's idiosyncratic graphics style made good use of the API, and Far Cry 2 made war-torn African savannah look appealing. No one matched Crysis' DX10 smorgasbord, though. Perhaps they were put off by the fact that no one had a PC that could run it. Developers seemed to be reining things in on DX10: instead of making a game take a giant leap forward into post-console visuals, DX10 releases looked increasingly similar to those made with old man DX9.

There were bigger problems emerging, too. DirectX 10 was only available to gamers who'd taken the plunge and upgraded their OS to Vista. Many were perfectly happy with XP and felt they didn't need it; others bought it and found a myriad of problems, from front end to driver support. DX10, then, was bound to an operating system few liked. The gaming world lost its appetite for Microsoft's new API.

The Journey of an API


In late 2009 DirectX 11 was released. It had a lot of wrongs to put right, and a lot of mistakes to avoid. This time it was tied to an operating system that people did like, and it showed impressive potential for gaming graphics and performance... and yet we still haven't seen the standout DX11 title. What we have seen are some impressive DX9 games, and even the odd OpenGL title.

Throughout the nineties, DirectX and OpenGL locked horns. They scrapped to be the dominant application programming interface (API) to replace the proprietary interfaces that preceded them. Each wanted to be the API that graphics card manufacturers would write driver support for, and that game developers would use as their toolset. SGI's OpenGL had the upper hand until the end of the decade, thanks to an alliance with id Software and the Quake engine - which powered Half-Life in a heavily modified form - not to mention id's subsequent blockbuster sequels in the Doom and Quake franchises. With each new game, the id Tech engines expanded their graphical features in parallel with OpenGL, and proved enticing engines for other developers to build their own games with.
 
As the decade ended, Microsoft muscled OpenGL off the API throne with DirectX 8. For the first time, it not only matched the latest OpenGL, but surpassed it by including vertex and pixel shader support. Microsoft also launched its Xbox, which hit the shelves late in 2001, ran on an operating system similar to Windows NT, and used a Windows API: DirectX 8.1.

This opened the door for cross-platform releases. Developers had a shared toolkit across the PC and Xbox platforms that made programming easier. Anyone who tried to play Resident Evil 2 or the like on PC will remember the calibre of cross-platform games - or console ports - developed with the console in mind, then re-coded for PC before this point. The vast majority were all but unplayable: controls were poorly translated to our beloved keyboard and mouse, and game code was unstable and glitchy. So, unsurprisingly, console games never sold particularly well when they finally made their way onto PC.

The game that bucked that trend was Halo. It was originally developed for PC and Mac, but when Microsoft bought Bungie Studios it made sense to use the buzz the game was generating to boost the fledgling Xbox. As a result, we had to wait two years for Halo to make its way to the PC, grinding our teeth as Xbox owners banged on about how freakin' good it was.

When it actually was released on PC in 2003, it came with DirectX 9.0 support. Perhaps Bungie felt bad about making the people who'd supported the game from the start wait so long, but the fact was the PC release offered more than the original console version - shiny new DX9 visuals that weren't possible on Xbox.

Halo's PC incarnation wasn't perfect. Frame rates were low and performance was often choppy - but it was playable, mouse-friendly and pretty, and that release set the template for cross-platform releases through the next decade. Sharing the DirectX API across platforms made for less buggy games. DirectX 9 was a quality API that allowed a lot of flexibility beyond standout visuals; OpenGL had been well and truly muscled out of the action while Microsoft sank its fingers into the warm pies of PC and console gaming.

Hard History Lessons

 
The current generation of consoles also proved to be pivotal for DirectX, but this time for the worse. The Xbox 360 supports a modified build of DirectX 9.0c with Shader Model 3 support. Sony's PlayStation 3 natively uses a flavor of OpenGL. Here in PC land we're on DirectX 11, which is no big deal to us since we can easily stick in a new GPU and unlock a new generation of graphics technology, but on consoles the native API is ruler for life.

That means if a developer works hard on integrating DirectX 11's tessellation and multi-core rendering into a PC title, they then have to code it back out for the console version. Whether or not that developer has built their engine from the ground up, that's an expensive and time-consuming undertaking. Given the small profit margins PC titles can expect to generate, it's simply not worth it for many game studios.

Is that why we've seen even big-budget blockbusters like Crysis 2 and The Witcher 2 stick with DX9.0c? Bartlomiej Wronski, graphics programmer for The Witcher 2, explains: "I think the main reason not all developers - including us - provide DX11 support is that supporting two totally different rendering APIs requires much more work and testing. It also creates pipeline difficulties. We would have had to decide which features have fast DX9 fallbacks, which are rendered in lower quality and which are totally dropped, and this would have meant even more work for our artists, programmers and testers.
 

"The Xbox [360] uses an API similar to DX9, with only few extensions from DX10/I1, like texture arrays or hardware tessellation, so to create a multi-platform engine, we had to base ours on DX9. We understand it's an ageing API, and that DX10 and 11 have new features and performance - it came down to balancing these and other considerations."

But there's more to consider than consoles, says Wronski - it's not as if every PC gamer has a DX11-ready machine: "The main reason games still implement DX9 is to support the many players whose PCs run Windows XP and older graphics cards. Fortunately, DX9.0c with Shader Model 3.0 is still a pretty powerful API, which allowed us to create a rendering engine capable of creating beautiful locations, characters and scenes."

So developers are faced with a fairly small user base when developing a DX11 title, compared with the prospect of a cross-platform release that also reaches gamers with dusty old XP machines via DX9.0c. The goalposts are constantly shifting, though, as gamers upgrade their machines. This makes for an uncertain future for DX11, as Wronski says: "While DX11 is already a very good and fairly mature API, many of its capabilities aren't widely used yet, so it's hard to speculate about new versions. All its cool features like geometry shaders, texture arrays and dynamic hardware tessellation are still awaiting broader implementation. DirectCompute could also help a lot: vendor-independent GPU computing is a feature that not only graphics can benefit from, but also particle and physics simulation."

The number of DX11 gamers is growing, and they will dominate the market soon, so including it is an obvious, important next step in engine development. "We believe that it may be up to the next generation of consoles. The usage of APIs in future games depends on the capabilities of the next-gen console GPUs," says Wronski.

So from a developer's perspective, they'll blink when gamers do. When there are enough DX11 end users open-mouthed for flashy games, it becomes commercially viable to make use of DX11 features - and if eighth-gen consoles support DX11 natively, that viability more than doubles.

The Downfall of DX10


So what's stopping everyone from buying into DX11? Probably its predecessor, and the infamous operating system it was bound inexorably to.

When Microsoft released Windows Vista, it's fair to say it took a bit of a kicking. There were driver issues, front-end gripes and performance problems, all reflective of the enormous changes under the bonnet. But perhaps Vista's biggest problem was that Windows XP still worked just fine. People weren't chomping at the bit for a new operating system; they were settled into XP as one settles into a comfortable chair. The incentive to jump from that comfortable chair onto the bed of nails that many regarded Vista to be was DirectX 10. XP would never get DX10 support; if you wanted to play DX10 games, you were forced to buy Vista. That was something new for Microsoft. Previously, new versions of DirectX had been unbound to any particular operating system.

You also needed a new DX10 graphics card, and that was no small change. Suddenly, you'd gone from comfy old XP and a healthy bank account to an unfamiliar OS and empty pockets. Why put yourself through it? Well, for one reason, Crysis.


Remember how quickly things moved during the emergence of 3D gaming? How you'd sit there, gazing in wonder at Doom, wondering what kind of transcendent visuals Doom II would have? Well, Crysis looked like that to our 2007 eyes. It was simply one imperial unit of graphical quality higher than anything else. It was clear Crysis was going to need a brute of a PC to play, and importantly it generated enough enthusiasm that PC gamers went out and bought those brutish PCs.

It was only when the dust had settled after Crysis that gamers found faults with DX10 itself. First, only Crytek seemed capable of drawing those cutting-edge visuals out of the API; it had set the bar high, but other developers didn't get near it. Second, DX10 wasn't a performance-enhancing API - in fact, all the games that shared it seemed to also share terrible frame rates. Gamers who bought into DX10 were hugely disillusioned.

Current Climate 

It would be understandable, then, if PC gamers had their trepidations about DX11. This time it isn't bound to a single Windows OS the way its predecessor was - it runs on Windows 7 and Vista alike - but it does require a new generation of graphics card and the subsequent outlay.

And yet, the trusting doe-eyed PC gaming community, never one to boycott a company or product based on even the slightest mishap or glitch and then spend months furiously typing obscenities about that company or product in their forums long after everyone stopped caring, did have faith in DX11. And they did buy into it.

Matt Ployhar is the president of the PC Gaming Alliance, a senior product planner at Intel, and a former Microsoft employee with experience in MS Games Studios and on the Windows/DirectX team through the DX10-11 development phases. He's as informed as anyone on Microsoft's API, and explains that there are in fact plenty of DX11 gamers ready and waiting for DX11 games. "The DX10-to-11 discrete-GPU install base alone is now sitting somewhere around 230 million unique PC gamers across every segment and geography. The problem for DX10 and 11 isn't the Total Available Market (TAM) or install base, which incidentally is larger than all the seventh-gen consoles combined; the problem is that it's more analogous to a largely untapped oil field."

So unlike CD Projekt's vision of the PC landscape, Ployhar sees DX11 as a graphics revolution waiting to happen, if only developers would tap that oil field. So why aren't there more DX11 games in development? "Good question," says Ployhar. "There should be. DX11 would likely be the most robust option available, presenting the least amount of compromise to one's vision for a game.

"The real problem is probably the lack of incentives, carrots, whatever you want to call them, for making PC games in general. Your console manufacturers can spend 100's of millions a year to single digit billions for a specific platform; and that equivalent spend doesn't occur on the PC"

It seems that the only thing holding DX11 back is time, rather than any failing with the API itself. Perhaps its predecessor DX10 didn't deserve its bad rep. It paved the way for DX11, for starters: "Getting people to switch to a new API is never an easy thing to do. It took several years for Microsoft to get the foothold and market share it did with D3D. This obviously didn't happen overnight. Nor should we expect it to happen with DX10-11 and beyond," says Ployhar.

"DX10 didn't really fail. (The API) was going up against a large established DX9 TAM and Install base. DX9 is very mature in terms of support, familiarity, tools, and so on. Once DX10 - which has matured into DX11 - becomes more mature and prevalent we'll start seeing the ocean turn in favor of what is effectively the DX10 code path"

Okay everyone, take a knee and let's gather our thoughts. DX10 got a tough time because it came tied to Vista, and at a time when the PC's financial ecosystem was - and is - being ransacked by piracy. Cross-platform releases thus made for safer bets, and that marginalized DX10 coding. However, DX10 grew into the beautiful butterfly that is DX11, and it seems gamers are ready and waiting to see what developers can achieve with it. The past few years might have changed the way we look at DirectX, but it's still as important to PC gaming as ever.



09 August 2011

Revive a Dead Smartphone

What should you do when your smartphone decides to play dead? Resist the urge to throw it against the nearest wall, and try one of these techniques instead.

Soft Reset
 
All phones have a soft reset function, which is similar to restarting your computer. Beware that performing a soft reset will cause you to lose any unsaved data, but you will retain information previously stored on your smartphone.
 
■ MOTOROLA BACKFLIP. Power the phone off. Remove and reinsert the battery, then power the phone back on.

■ ANDROID (OTHER). All remaining Android models use a simple power cycle to perform a soft reset. Just turn the phone off and then back on again.
 
■ BLACKBERRY (QWERTY KEYBOARD). Press and hold the ALT-CAP-DELETE key combination. The display goes black for a second and your BlackBerry resets.
 
■ BLACKBERRY (SURETYPE KEYBOARD). Press the ALT-CAP and Right Shift-DEL keys. When the screen goes blank, release the keys.
 
■ BLACKBERRY (TOUCHSCREEN). Turn the BlackBerry off and remove the battery for at least 30 seconds. Reinstall the battery and turn the device back on.
 
■ IPHONE (ALL MODELS). Press and hold the Sleep/Wake button on the top of the iPhone and the Home button. Continue to hold both buttons (approximately 10 seconds) until the screen goes blank. You'll see the white Apple logo as the iPhone reboots.
 
■ NOKIA (ALL MODELS). Power the phone off and remove the battery for 30 seconds. Reinstall the battery and power the phone on. Alternatively, you can enter the code *#7380# and select Yes.
 
■ PALM PRE. If the phone's menus are still active, select Device Info, Reset Options, then Soft Reset. If the Palm Pre is locked up or frozen, hold the power button and cycle the ringer switch on and off three times. If that doesn't work, press and hold the Orange, Sym, and R keys until the device reboots. As a last resort, turn the phone off, remove the battery for 10 seconds, reinstall the battery, and power the phone up.

■ ALL OTHER SMARTPHONES. You can generally perform a soft reset by powering the phone off, removing the battery for 30 seconds, and powering the phone back on.
 
Hard Reset 
A hard reset is a last-ditch option that returns your phone to its factory settings, which means you will lose all data and installed applications. Before you perform a hard reset, remove the memory card from your phone; that way you can recover data from the card later.

AT&T TERRESTAR GENUS. 
With the device turned off, press the red power key. When the TerreStar logo appears, press and hold the E-Power keys until a green checkmark appears in the lower-left corner. Release all keys. The device will power up and perform a factory reset.
 
■ ANDROID (ALL MODELS WITH FUNCTIONING MENU SYSTEMS). One of the following menu-based routes for performing a hard reset should work, depending on the phone and version of Android:
• Open the application menu. Tap Settings, SD and Card Storage, Factory Data Reset, and follow the on-screen instructions.
• From the Home screen, tap Menu, Settings, Privacy, and Factory Data Reset, and then follow the on-screen instructions.
• From the Home screen, tap Menu, Settings, Security, and Factory Data Reset, and then follow the on-screen instructions.
When the menu system isn't functional, follow these phone-specific options to perform a hard reset.

■ DELL VENUE. With the device turned off, press and hold the Volume Up and Volume Down buttons. Without releasing the buttons, press and hold the Power button. When the device configuration screen appears, release all buttons. Use the Volume Up or Down button to move the selection to Factory Reset. Press the Camera button to select the Factory Reset option and start the reset process.

■ GOOGLE NEXUS ONE, NEXUS S. Turn the phone off. Press and hold Volume Down while you press and release the Power button. Use the Volume Down button to select Clear Storage from the list of options. Press the Power button, and confirm your selection by pressing the Volume Up button.

■ T-MOBILE COMET. If possible, back up your data to Google's servers by selecting Privacy from the Settings screen. Select the Back Up My Data option. When the backup is complete, return to the Settings screen and select Privacy and Factory Data Reset. When prompted, tap Reset Phone, then tap Erase Everything.

■ T-MOBILE G2X. If possible, back up your data to Google's servers by selecting Privacy from the Settings screen. Select the Back Up My Data option. When the backup is complete, power off the phone. Press and hold the Power/Lock-Volume Down keys for at least 15 seconds. The phone should turn back on and perform a factory reset. If the screen is frozen, or the phone doesn't turn back on, remove the battery, wait 30 seconds, then reinstall the battery and try again.
 
■ MOTOROLA DROID. Turn the phone off. Press and hold the Power-X keys to force the phone into recovery mode. Next, press and hold the Volume Up-Camera key to display the recovery menu. Select Wipe Data/Factory Reset from the menu, and then select Reboot Phone.

■ MOTOROLA DROID PRO, DROID 2 GLOBAL. Select Settings, Privacy, and Factory Data Reset. When prompted, tap Reset Phone to erase all data and return the phone to factory conditions.

■ MOTOROLA BACKFLIP. Power the phone off. Press and hold the Power and Camera buttons. When the phone turns on, release the Power button but continue to hold the Camera button until prompted to release it. Next, press the Volume Down button. After 15 seconds, a yellow triangle with an exclamation point will appear. With your phone closed, tap the bottom-right corner of the display and select Wipe Data/Factory Reset. Press OK and follow the on-screen instructions.

■ BLACKBERRY (ALL MODELS). Remove the battery for 30 seconds. Reinstall the battery and turn the phone back on.

■ BLACKBERRY STYLE, BOLD, STORM, CURVE, TOUR, TORCH. Click the Options icon on the Home screen. Select Security and then Security Wipe. Select all three of the available checkboxes to perform a complete wipe and reset the device to factory condition. Type the word BlackBerry and click Wipe.

■ HTC ARRIVE, HD7, SURROUND. Press Start and tap the right-facing arrow. Tap Settings, About, and then tap Reset Your Phone. Tap Yes, and then tap Yes again.
If the screen is frozen, turn the device off. Press and hold the Volume Up-Down buttons and briefly press the Power key. When the screen displays instructions for resetting the device, release the Volume Up-Down buttons.

■ IPHONE (ALL MODELS). From the Home screen, tap Settings, General, Reset, and Reset All Settings. This action resets all preferences but retains applications and data. If that doesn't work, from the Home screen, tap Settings, General, Reset, and Erase All Content and Settings. This will delete all data and applications and return the iPhone to factory condition.
 
■ NOKIA (ALL MODELS). With your phone powered on or in standby mode, type *#7370# and select Yes when prompted. You may need your lock code for confirmation. The default lock code is 12345. If your phone doesn't turn on, try pressing the On/Off button, *, and 3 simultaneously.

■ MICROSOFT WINDOWS PHONE 7 (ALL MODELS). Press Start and tap the right-facing arrow. Tap Settings, About, and Reset Your Phone. Tap Yes, and then tap Yes again.
 
■ PALM PRE. Open Device Info, tap Phone Reset Options, and then tap Full Erase twice. If your Palm Pre is frozen and you are unable to use the menus to perform a reset, try running the latest version of webOS Doctor (ws.palm.com/webosdoctor/sorry.htm) to troubleshoot and reset the device, then follow the on-screen instructions.


08 August 2011

It's time Apple built antivirus protection into its OS

Apple, allow me to introduce your left hand to the right. It might make it easier to communicate important facts to customers, such as whether or not OS X users should run antivirus software. The issue of Mac security popped up again with the arrival of Mac Defender, a fake antivirus package that fooled many Apple customers into handing over credit card details to remove imaginary security issues. Described as a tipping point in the Mac malware timeline, Mac Defender marks the point at which Apple needs to grow up, and shows what it needs to learn from Microsoft: give as much information as possible and protect everything.

Apple appears to be where Microsoft was in 2005 before it started to include Windows Defender with its operating systems, following two years' work on "Trusted Computing". Apple, by contrast, has its head buried in the sand, pretending the malware problem doesn't exist, despite security experts predicting a rising tide as the Mac becomes more popular.

The problem isn't simply that there are now more Macs in circulation, making the platform a juicier target for cyber-criminals. The security scare is compounded by Apple customers themselves. Spoon-fed stories about the safety of the platform, consumers have become blase about the dangers. One security expert we spoke to said that although Apple represents only 10% of the PC market, Mac owners are perhaps ten times more likely to "fall for" scams such as Mac Defender, because they haven't had years of attacks to encourage a sense of skepticism - unlike Windows users.

So what is Apple telling customers about how to deal with security? The message seems to be don't bother with it, largely. "Mac OS X doesn't get PC viruses. And its built-in defenses help keep you safe from other malware without the hassle of constant alerts and sweeps," the company shouts in bold at the top of its security page. "Every Mac ships with a secure configuration, just turn your Mac on and start working. When you need to be aware of something, it will let you know."

Except when it doesn't. Apple took three weeks to acknowledge Mac Defender, instructing support workers to wash their hands of the issue. "AppleCare does not provide support for removal of the malware. You should not confirm or deny whether the customer's Mac is infected or not," the company dictated in a leaked internal memo.

The only admission Apple makes of malware threats is a footnote on its security page: "Since no system can be 100% immune from every threat, antivirus software may offer additional protection." It's a bit like saying condoms "may" help prevent the spread of STDs on college campuses.

Looking for more definitive advice, I headed out to the Apple Store in Regent Street, posing as a potential customer. "I have Kaspersky on my Mac," confided the sales assistant - we'll call him C*r*s to spare the helpful expert from a run-in with Apple's secret police.

"It's hard for Macs to get viruses, so criminals are trying to panic you into getting some kind of bad software,"said one of my friend. "To be honest, I've got a firewall and a Kaspersky program, so now I feel that no-one can trick me. You're talking about a pretty expensive machine, so you'd want to protect it."

A support worker on Apple's helpline was less forthcoming, saying only that "we don't recommend installing security software, but you can if you want to for your own peace of mind".

And what about Apple itself? Does the company really trust its own OS so much that it sees no need for corporate-wide antivirus? Some security experts believe it does use security software, but the company wouldn't tell us either way.

So it's mixed messages all around. Until Apple actually puts function before marketing fluff and follows Microsoft's lead with built-in anti-malware protection, the security threat is only going to get worse.

Security experts have told us that it's actually no more difficult to write malware for Macs than it is for Windows machines - it's just that there's a smaller pool of virus-writing talent plying its trade in OS X. That could change with the release in underground cyber-crime forums of a DIY crime-ware kit for Macs that could allow almost anyone to write code to steal passwords and inject code into websites. And with Mac users seen as more affluent than the norm, there is plenty of motivation.

As we wait for Apple to respond to the growing threat, and in the absence of official guidance, the best example comes from Apple's own sales staff. If antivirus is important enough for my friend at the Apple Store, then it should be important for customers too.