Btw, on a slightly related note, the Last Guardian has been delayed again:
"To ensure that The Last Guardian delivers on the experience that the game’s creators have envisioned,"
-Shuhei, super Sony master
http://blog.us.playstation.com/2016/09/12/an-update-on-the-last-guardian/
In case people don't remember, this is the game over which Ueda Fumito left the Japan Studio umbrella, only to be hired back as an outside consultant, once it became clear that Sony was not going to develop it exclusively for the PS3, but push it to "Orbis" instead, i.e. the PS4. This also came at the same time that Kojima, Square Enix, Polyphony Digital, and various other studios were essentially told offhand that the PS3 was dead, and that the next platform would be "multiplatform", i.e. a PC.
You can guess very easily that what this meant for the development of The Last Guardian wasn't that they ran into unforeseen bugs, but that: 1. they could no longer develop code that leveraged the custom instruction set capabilities in parallel on the PS3, and therefore everything had to be scrapped and redone; and 2. certain trailers of the game that we now know were faked were what emerged after Sony's internal people took over the porting work and rebuilt the title. The game couldn't possibly run the animation correction and the world updates per frame, because the process diagram on a Cell BE, with updates per clock cycle, is physically impossible to compact onto a PC platform. The problem is the bandwidth crunch when substituting the graphics context: if you want to do complex math on the graphics context, you need to replace the VRAM contents every other frame, and the transfer rate between the PCI bus and system RAM just isn't fast enough to do this every cycle, or quickly enough to ensure object consistency. Cell BE process diagrams, on the other hand, can be kept consistent if programmed well, because the satellite processors (the SPEs), with their complex instruction sets, have asynchronous access to system RAM.
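In case the "asynchronous access to system RAM" part sounds abstract: this is a minimal sketch of the standard double-buffered DMA loop that SPU code was built around (the mfc_* calls are real Cell SDK / spu_mfcio.h APIs; the process() function, tag choices, and chunk size are placeholders for illustration). The point is that the fetch of the next chunk from main memory runs in the background while the current chunk is being computed on, so the SPU never stalls on the bus the way a PC-style VRAM swap does.

```c
/* Sketch of the classic double-buffered SPU streaming loop.
 * process() and CHUNK are invented for illustration. */
#include <spu_mfcio.h>

#define CHUNK 16384                      /* 16 KB, the max size of one DMA transfer */
volatile char buf[2][CHUNK] __attribute__((aligned(128)));

void process(char *data, int n);         /* hypothetical per-chunk work */

void stream(unsigned long long ea, int nchunks)
{
    int cur = 0;
    /* kick off the first transfer from main memory into local store */
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);

    for (int i = 0; i < nchunks; i++) {
        int next = cur ^ 1;
        /* start fetching chunk i+1 while chunk i is still in flight */
        if (i + 1 < nchunks)
            mfc_get(buf[next],
                    ea + (unsigned long long)(i + 1) * CHUNK,
                    CHUNK, next, 0, 0);

        /* wait only for the current buffer's DMA tag, then compute on it */
        mfc_write_tag_mask(1 << cur);
        mfc_read_tag_status_all();
        process((char *)buf[cur], CHUNK);

        cur = next;
    }
}
```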
This architecture approach is also why the PS2, in spite of fairly weak hardware, was very successful. Take Shadow of the Colossus, which Ueda made with parts of Japan Studio at the tail end of the PS2's life cycle. The entire animation system and world update approach (i.e., the moving disc of the playable area, the climbing on moving objects, etc.) comes from the fact that the graphics context can be adjusted with more complex math each cycle than shader logic allows. In Ico it's the same thing: the hand-holding of your ghost girlfriend in that game isn't a canned animation, but something that's adjusted based on environment variables. Not extremely complex, but very difficult to do on more linear GPU/CPU contexts. (Basically no one had done anything similar on PC without OpenCL-type operations on the graphics card, until No Man's Sky used the SSE instruction set for mapping world updates to be included in the rendering context. Which is probably the second useful thing SSE has ever been used for, outside of when Intel made a Linux kernel that booted in 1.5 seconds or something like that.)
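To make the Ico example concrete, here's a toy sketch of what per-frame correction means versus a canned clip (every struct, name, and tuning constant here is invented, not anything from the actual game): the hand position is recomputed each frame from the partner's actual position, the arm's reach, and the ground underneath, instead of being baked into the animation data.

```c
/* Toy per-frame procedural correction: the animated hand is pulled
 * toward the partner's real hand, clamped to arm reach and terrain. */
#include <math.h>

typedef struct { float x, y, z; } vec3;

static vec3 v_lerp(vec3 a, vec3 b, float t) {
    vec3 r = { a.x + (b.x - a.x) * t,
               a.y + (b.y - a.y) * t,
               a.z + (b.z - a.z) * t };
    return r;
}

static float v_dist(vec3 a, vec3 b) {
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return sqrtf(dx * dx + dy * dy + dz * dz);
}

/* Called once per frame, after the base animation pose is sampled. */
vec3 correct_hand(vec3 anim_hand, vec3 shoulder, vec3 partner_hand,
                  float max_reach, float ground_y)
{
    /* pull the animated hand toward the partner's actual hand... */
    vec3 target = v_lerp(anim_hand, partner_hand, 0.85f);

    /* ...but never further than the arm can physically reach... */
    float d = v_dist(shoulder, target);
    if (d > max_reach)
        target = v_lerp(shoulder, target, max_reach / d);

    /* ...and never below the terrain under the characters */
    if (target.y < ground_y)
        target.y = ground_y;

    return target;
}
```

Cheap math per joint, but it has to run on live world state every single frame, which is exactly the kind of per-cycle update the post is talking about.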
Anyway. So why bring this up? Just to illustrate how hung up Sony is on making the PS4 a success, and how far they'll actually go in changing their productions to achieve it. They've invested immensely in the platform and in its upgrade. They've burned through any number of internal studios to become a more traditional publisher for third-party developers (who then pay them for access to the platform, SDKs, etc.), rather than paying to develop interesting tech on their own platform, which is what they used to do.
And suddenly it's easy to explain why NMS has been cut apart and stuck in some sort of QA hell for the last six months. Because, for example: if you're going to run the Sony OS with the Twitch-stream functionality and encoder stuff in the background, the platform (AMD's Jaguar cores) suffers reduced thread response on all cores, owing to the fact that it relies on cache shared between cores. This has been a known thing with the PS4 (and the Xbone) from the beginning: games simply failed to perform even when they were written to specification. Several developers have commented on this individually, that the cores they can use essentially limit each other, while thread management isn't exactly straightforward.
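The shared-cache point is easy to demonstrate on any multicore machine with a related, simplified effect: two threads that never touch the same data can still throttle each other through cache coherency traffic alone. A toy C sketch (iteration count and padding size are arbitrary):

```c
/* Two threads hammer counters that happen to share a cache line,
 * so each increment invalidates the other core's copy of the line.
 * Compile with: cc -O2 -pthread contention.c */
#include <pthread.h>
#include <stdio.h>

#define ITERS 100000000L

struct {
    volatile long a;                /* thread 1's counter */
    /* char pad[64]; */             /* uncomment: put b on its own cache line */
    volatile long b;                /* thread 2's counter */
} shared;

static void *bump_a(void *arg) {
    for (long i = 0; i < ITERS; i++) shared.a++;
    return NULL;
}
static void *bump_b(void *arg) {
    for (long i = 0; i < ITERS; i++) shared.b++;
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, bump_a, NULL);
    pthread_create(&t2, NULL, bump_b, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("%ld %ld\n", shared.a, shared.b);
    return 0;
}
```

On most hardware, uncommenting the padding so the counters land on separate cache lines speeds the loops up severalfold. The PS4 complaint is the same family of problem one level up: the Jaguar cores sit in two quad-core clusters that each share an L2, so background OS threads (the encoder, etc.) evict game data and stall cores that never touch the OS's memory at all.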
So the result tends to be reducing background thread intensity by dropping features, dropping graphics passes that aren't done "for free" on the graphics card in the background, reducing bandwidth usage to avoid VRAM transfers and the bus crunches they cause, etc. In pretty much the same fashion the PS3 took so much flak for when it came to ported multiplatform titles.
And if the developers can't magick some extra grunt out of the 4-5 cores they have available, they simply "fail", as Shuhei puts it. They're not "up to the task", as is common to hear in Sony's beta channel, for not being able to create a build that basically beats the trailers while running in real time on the PS4. That's not going to happen with titles that require heavy multithreading, or that lean on instruction types that suffer very quickly when cores share cache (which forces context switches, and therefore increases running time). But Sony has invested so heavily in this that they will keep going anyway.
Which pleases me to no end, because I couldn't wish a worse ending on Sony as a publisher, after all the titles they've bungled with their idiotic marketing-related concerns over the last few years. At their own cost as well, of course.