http://www.rockpapershotgun.com/2010/03/10/unlimited-detail-wants-to-kill-3d-cards
Pretty cool, if it ends up being a realistic way to make games. I miss voxels anyway.
Post edited March 10, 2010 by captfitz
Voxels had a more... natural look to them, especially when polygons were still rather basic. They did have a tendency to look terrible up close, though. But if this new system can pull off extreme detail while being practical, they could be on to a winner. Having said that, Nvidia is as corrupt as an Italian Prime Minister (i.e. VERY) and would probably pull out a lot of cash to force companies to ignore it.
Amazing. I always preferred voxels over polygons. Did they say when Outcast 2 will be released? :o
Post edited March 10, 2010 by Cambrey
They seem to have the world's ugliest website. And 16 months? Really?
Wouldn't voxels also make it easier to handle collisions realistically, since stuff has "mass" instead of just being papercraft?
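In principle, yes: with volumetric data a collision test can just ask whether any solid cell of one object overlaps a solid cell of another, instead of intersecting hollow polygon shells. A toy sketch of that idea, assuming axis-aligned occupancy grids that share a cell size (all names here are made up for illustration):

```python
def voxels_collide(grid_a, origin_a, grid_b, origin_b):
    """grid_* is a set of occupied (i, j, k) cells; origin_* is that grid's
    cell offset in world space. Returns True on any solid-solid overlap."""
    ax, ay, az = origin_a
    bx, by, bz = origin_b
    # Shift grid B's occupied cells into grid A's coordinates and intersect.
    shifted_b = {(i + bx - ax, j + by - ay, k + bz - az) for (i, j, k) in grid_b}
    return not grid_a.isdisjoint(shifted_b)

# Two 2x2x2 solid blocks: separated by a gap, then overlapping by one cell.
block = {(i, j, k) for i in range(2) for j in range(2) for k in range(2)}
print(voxels_collide(block, (0, 0, 0), block, (3, 0, 0)))  # False: gap on x
print(voxels_collide(block, (0, 0, 0), block, (1, 0, 0)))  # True: they overlap
```

Rotations and mismatched resolutions complicate things in practice, but the volumetric intuition holds.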
Very interesting. Uhm, they are very vague about what they can and can't do and how they do it, but at least it's an innovative way to look at 3D graphics (the search algorithm part, only showing the points that are necessary). Curious to see if this is all hot air or if they're actually onto something.
It would be more convincing if they'd actually hired a graphics artist for their presentation. It's hard to tell the difference between "this looks like crap because it's a poor system" and "this looks like crap because we just don't have those sorts of skills".
Wishbone: It would be more convincing if they'd actually hired a graphics artist for their presentation. It's hard to tell the difference between "this looks like crap because it's a poor system" and "this looks like crap because we just don't have those sorts of skills".

Well, it is unfamiliar technology, so I doubt there are any artists who are adept at creating anything pretty to look at either.
Personally I want to see where this technology goes. I don't fully understand the potential of it yet, but I do understand how much more organic artwork can be using points instead of flat surfaces.
fuNGoo: Well, it is unfamiliar technology, so I doubt there are any artists who are adept at creating anything pretty to look at either.

Well, that depends on how it is implemented. I can only assume that standard modeling techniques apply, and it's a matter of which format you export the model to at the end of the process.
fuNGoo: Personally I want to see where this technology goes. I don't fully understand the potential of it yet, but I do understand how much more organic artwork can be using points instead of flat surfaces.

Again, I'm confused as to what they mean by "points". The way they describe it, it sounds like a model is made up of a lot of points, each with their own coordinates. I doubt that is what they actually mean though, since that would mean that when you zoomed in close enough, you'd be looking between the points straight through the model. Also, it would take up MUCH more memory space than polygons.
The only way I can see that it can possibly work is if the "points" in question are in fact control points for splines or bezier patches, and the renderer then interpolates between them. If they've found a way to do that cheaply and in real time, with full lighting and texturing, then this will truly revolutionize computer graphics as much as they say. It would make no additional demands of the artists, but would simply skip the conversion of the model to a fixed low-poly version, and instead use the fully detailed original model.
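To make the control-point idea concrete, here is a rough sketch of how a renderer could evaluate a bicubic Bezier patch from a 4x4 grid of control points, producing a smooth surface position for any (u, v) in [0, 1]. This is just standard spline math; nothing published says it is what Unlimited Detail actually does:

```python
from math import comb

def bernstein3(i, t):
    """Cubic Bernstein basis polynomial B(i,3)(t)."""
    return comb(3, i) * (t ** i) * ((1 - t) ** (3 - i))

def bezier_patch_point(control, u, v):
    """Evaluate a bicubic Bezier patch at (u, v).
    control is a 4x4 grid of (x, y, z) control points."""
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein3(i, u) * bernstein3(j, v)
            px, py, pz = control[i][j]
            x += w * px
            y += w * py
            z += w * pz
    return (x, y, z)

# A flat 4x4 grid with one interior control point pulled upward:
control = [[(i, j, 0.0) for j in range(4)] for i in range(4)]
control[1][1] = (1.0, 1.0, 2.0)
print(bezier_patch_point(control, 0.5, 0.5))  # a smooth bulge, not a sharp spike
```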
Useless. OpenGL has offered support for tessellation for years, and we still look at it like a "revolution" made possible by Microsoft's ugly proprietary DirectX libraries, version 11.
Voxels tried to outmatch the polygon-based way of doing 3D things, and lost because polygons were an overall better and more standardized approach. Now, with programmable GPUs in which you can do almost anything you want with pixels and polygon vertices, this "new technology" has zero chance of seeing the light of day.
KingofGnG: OpenGL has offered support for tessellation for years, and we still look at it like a "revolution" made possible by ... DirectX ... 11.

You are greatly mistaken. The OpenGL extension for tessellation wasn't actually made publicly available until just last year, only works on ATI cards (and only select models at that), is totally incompatible with compute shaders (included in DX11+ hardware for the very purpose of handling tessellation and the like), and lacks the "hull" and "domain" shader functionality of DX11's tessellation (DX11 tessellation has three programmable shaders while OpenGL's has only one).
OpenGL's was first, sure, but Microsoft's solution is brand-independent and more customisable.
Post edited March 10, 2010 by Arkose
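For anyone wondering what those extra stages actually do, here is a CPU-side caricature of the DX11 tessellation flow: a per-patch "hull" step picks a tessellation factor, a fixed-function step generates (u, v) samples, and a per-sample "domain" step turns them into surface positions. Real hull and domain shaders run on the GPU in HLSL; the names and numbers below are made up purely for illustration:

```python
def hull_stage(patch, camera_distance):
    """Per-patch: choose how finely to subdivide (the tessellation factor)."""
    return max(1, int(16 / max(camera_distance, 1.0)))

def fixed_function_tessellator(factor):
    """Generate (u, v) sample coordinates covering the patch domain."""
    step = 1.0 / factor
    return [(i * step, j * step)
            for i in range(factor + 1) for j in range(factor + 1)]

def domain_stage(patch, u, v):
    """Per-sample: turn domain coordinates into a surface position.
    Here it is plain bilinear interpolation of a quad's four corners."""
    p00, p10, p01, p11 = patch
    lerp = lambda a, b, t: tuple(a[k] + (b[k] - a[k]) * t for k in range(3))
    return lerp(lerp(p00, p10, u), lerp(p01, p11, u), v)

quad = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
factor = hull_stage(quad, camera_distance=4.0)   # closer camera -> higher factor
samples = fixed_function_tessellator(factor)
verts = [domain_stage(quad, u, v) for u, v in samples]
print(factor, len(verts))  # 4 25 -- four input corners become 25 vertices
```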
If I hear the word 'Unlimited' again today I will be forced to punch someone. It's interesting nevertheless.
Unlimited
Interesting. But it's not difficult to see the shortcomings from that video. No environmental effects, and everything is static, including the water. Even the lighting seemed primitive.
Hopefully newer, better demonstrations will be made of how this technology can actually be used in games. But even that's moot if they don't get developer support. I like the general idea. But this hasn't won me over yet.
Some things puzzle me. I don't understand point cloud search. How can you "search" for things when rendering graphics? Sounds way too pie in the sky to claim "unlimited" detail as well. It's a pretty cool concept, reminds me of voxels (ahh, Delta Force, how I've missed ye), but like others have said, it looks kind of... janky.
And like I said before, 16 months sounds pretty unrealistic. First, they don't have funding. Second, there will be a lot of work to get this up and running. And third, who's to say nVidia or Intel or ATI won't kill this thing somehow? It's pretty obvious that having great graphics and detail without computing power that takes up 75% of the National Grid would very much affect their business. And of course, by focusing solely on the geometry of visuals they seem to have forgotten about everything else. Time will tell.
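On the "search" question above: the usual reading of the Unlimited Detail pitch is one hierarchical lookup per screen pixel, descending a tree over the point cloud only until a node becomes smaller than the pixel, so detail you cannot see is never touched. That is speculation, but a toy sparse-octree version of the idea looks roughly like this:

```python
class OctreeNode:
    """Toy sparse octree storing one representative colored point per node."""

    def __init__(self, center, half_size):
        self.center = center        # (x, y, z) center of this cube
        self.half_size = half_size  # half the cube's edge length
        self.children = [None] * 8  # sparse: most entries stay None
        self.point = None           # coarse representative (position, color)

    def child_index(self, p):
        cx, cy, cz = self.center
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

    def insert(self, position, color, min_size=0.01):
        # The last-inserted point stands in as the coarse representative here.
        self.point = (position, color)
        if self.half_size <= min_size:
            return
        i = self.child_index(position)
        if self.children[i] is None:
            off = self.half_size / 2
            cx, cy, cz = self.center
            child_center = (cx + (off if i & 1 else -off),
                            cy + (off if i & 2 else -off),
                            cz + (off if i & 4 else -off))
            self.children[i] = OctreeNode(child_center, off)
        self.children[i].insert(position, color, min_size)

    def query(self, p, footprint):
        """Return one representative point near p, stopping as soon as the
        node is already smaller than the requested footprint (e.g. one pixel)."""
        if self.half_size <= footprint:
            return self.point
        child = self.children[self.child_index(p)]
        if child is None:           # empty space below this level
            return self.point
        return child.query(p, footprint)

root = OctreeNode((0.0, 0.0, 0.0), 8.0)
root.insert((1.2, 3.4, -2.0), color=(200, 180, 120))
print(root.query((1.2, 3.4, -2.0), footprint=0.5))   # coarse query: stops early
print(root.query((1.2, 3.4, -2.0), footprint=0.02))  # fine query: descends further
```

With a real point cloud the coarse query would return a merged stand-in point rather than an exact sample, but the cost per pixel stays roughly logarithmic in scene size, which is presumably where the "unlimited detail" claim comes from.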