Fórum Outer Space - Brazil's biggest games forum
GDC17 NVIDIA Show Guide

rickrj
On NVIDIA's Tile-Based Rendering

Looking back on NVIDIA's GDC presentation, perhaps one of the most interesting topics covered was the implementation of tile-based rendering on NVIDIA's post-Maxwell architectures. This is an adaptation of approaches typical of mobile graphics rendering, which keep mobile's specific need for power efficiency in mind - and if you'll "member", "Maxwell" was NVIDIA's first graphics architecture publicly touted for its "mobile first" design.

This approach essentially divides the screen into tiles, and then rasterizes the frame on a per-tile basis. 16×16 and 32×32 pixels are the usual tile sizes, but both Maxwell and Pascal can dynamically assess the required tile size for each frame, changing it on the fly according to the complexity of the scene. The goal is to ensure that the data being processed has a much smaller footprint than that of the full frame - small enough that NVIDIA can keep it in a much smaller pool of memory (essentially, the L2 cache), dynamically filling and flushing the cache until the full frame has been rendered. This means the GPU doesn't have to access larger, slower memory pools as often, which reduces the load on the VRAM subsystem (freeing memory bandwidth for other tasks) while simultaneously accelerating rendering. At the same time, a tile-based approach lends itself well to the nature of GPUs - tiles are easily parallelized operations, with the GPU able to tackle many independent tiles simultaneously, depending on the available resources.
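The flow described above can be sketched in a few lines. This is a deliberately naive illustration - the tile size, buffer layout and shading function are all made up for the example, and real hardware schedules this very differently:

```python
# Naive sketch of tile-based rasterization: split the screen into tiles
# small enough to live in a fast on-chip buffer (standing in for the L2
# cache), finish each tile there, then flush it once to the framebuffer
# that lives in larger, slower memory ("VRAM").

TILE = 32  # tile edge in pixels; real GPUs pick 16x16/32x32 dynamically

def render_frame(width, height, shade):
    framebuffer = [[None] * width for _ in range(height)]  # slow "VRAM"
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            # produce one tile entirely inside the small on-chip buffer
            tile = [[shade(x, y)
                     for x in range(tx, min(tx + TILE, width))]
                    for y in range(ty, min(ty + TILE, height))]
            # flush the finished tile to slow memory exactly once
            for dy, row in enumerate(tile):
                for dx, px in enumerate(row):
                    framebuffer[ty + dy][tx + dx] = px
    return framebuffer

# toy "shader": pixel brightness from its coordinates
fb = render_frame(64, 48, lambda x, y: (x + y) % 256)
```

The point of the sketch is the access pattern: each tile is produced and consumed entirely in the small buffer, and the large framebuffer is touched only once per pixel.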

Thanks to NVIDIA's public acknowledgement of its use of tile-based rendering starting with the Maxwell architecture, some design decisions on Maxwell now make much more sense. Below is a screenshot taken from NVIDIA's "5 Things You Should Know About the New Maxwell GPU Architecture". Take a look at the L2 cache size. From Kepler to Maxwell, the cache size increased 8x, from 256 KB on Kepler to 2048 KB on Maxwell. We can now attribute this gigantic leap to the need for a larger L2 cache to fit the tile-based resources required by the rasterizing process, which allowed NVIDIA the leap in memory performance and power efficiency it achieved with Maxwell compared to its Kepler predecessor. Incidentally, NVIDIA's GP102 chip (which powers the GTX Titan X and the upcoming, recently announced GTX 1080 Ti) doubles that amount of L2 cache to a staggering 4096 KB. Whether or not Volta will continue scaling the L2 cache remains to be seen, but I've seen worse bets.
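As a back-of-the-envelope check of why that 2048 KB L2 matters for tiling, assume a hypothetical 8 bytes per pixel (4 bytes of color plus 4 bytes of depth - the real per-pixel footprint is not public, so treat the tile counts as illustrative only):

```python
# How many 32x32 tiles fit in each generation's L2 cache, assuming a
# hypothetical 8 bytes per pixel (4 color + 4 depth)?
bytes_per_pixel = 4 + 4
tile_bytes = 32 * 32 * bytes_per_pixel             # 8 KiB per tile

l2_kepler, l2_maxwell, l2_gp102 = 256, 2048, 4096  # KiB, per the slide

print(l2_maxwell // l2_kepler)                 # the 8x Kepler-to-Maxwell jump
print((l2_kepler * 1024) // tile_bytes)        # tiles resident on Kepler
print((l2_maxwell * 1024) // tile_bytes)       # tiles resident on Maxwell
print((l2_gp102 * 1024) // tile_bytes)         # tiles resident on GP102
```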

An interesting tangent: the Xbox 360's eDRAM and the Xbox One's ESRAM (attached to AMD-designed GPUs, no less) serve a similar purpose to the tile-based rasterization process that post-Maxwell NVIDIA GPUs employ, keeping render data in a small pool of fast memory.

Tile-based rendering seems to have been a key part of NVIDIA's secret sauce for achieving the impressive performance-per-watt ratings of its last two architectures, and its approach to this rendering mode should only improve with time. Some differences in tile-based rendering between Maxwell and Pascal can already be seen, with the former dividing the scene into triangles and the latter breaking a scene up into squares or vertical rectangles as needed, which shows NVIDIA has put in some measure of work on the rendering system between these two architectures.

Perhaps we have already seen some seeds of this tile-based rendering in AMD's Vega architecture sneak peek, particularly in regard to its next-generation Pixel Engine: the render back-ends are now clients of the L2 cache, replacing the non-coherent memory access of previous architectures, in which the pixel engine wrote directly to the memory controller. This could be AMD's way of tackling the same problem, with AMD's improvements to the pixel engine via a new-generation draw-stream binning rasterizer supposedly helping to conserve clock cycles while simultaneously improving on-die cache locality and memory footprint.

David Kanter, of Real World Tech, has a pretty interesting YouTube video where he goes into some depth on NVIDIA's tile-based approach, which you can check out if you're interested.


Source: NVIDIA Devblogs, Real World Tech
NVIDIA Announces DX12 Gameworks Support
by R-T-B Wednesday, March 1st 2017 01:02 Discuss (23 Comments)
NVIDIA has announced DX12 support for its proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA claims that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 with that style of effects. Obviously, Async Compute is a DX12-exclusive technology. The performance gains in an area where NVIDIA is normally perceived not to do so well are indeed encouraging, even if only within its exclusive ecosystem. Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.

NVIDIA Announces Public Ansel SDK, Developer Plugins
by R-T-B Wednesday, March 1st 2017 01:04 Discuss (7 Comments)
NVIDIA's Ansel, a framework for real-time screenshot filters and photographic effects, has seen the release of a public SDK and a few developer plugins to boot. Unreal Engine and Unity have both gained plugins for the technology, and it is reportedly coming to Amazon's Lumberyard engine as well. This should most assuredly aid adoption of the technology, as well as open it up to markets where it was previously unavailable, such as indie game development. The public SDK is presently available for download directly from NVIDIA at developer.nvidia.com/ansel.



Shadowplay Now Automagically Records Your Greatest Moments
by R-T-B Today, 01:00 Discuss (2 Comments)
NVIDIA has announced a new SDK for its products known as Shadowplay Highlights. Shadowplay Highlights augments the existing game recording technology of NVIDIA Shadowplay to automatically capture hot moments in your favorite video game. Whether it's your latest Triple Kill or a particularly daring jump on the race track, if the game engine tells the SDK it's significant, Shadowplay spins up, combining previously recorded gameplay with live recordings to create a perfect video of your glory moment. You can then edit the footage from within the game and upload it directly to a number of social networks.

The technology includes many options for quality or disk-space savings, and anything in between. Of course, as with all things Shadowplay, the technology will require a GeForce-branded graphics card, as well as support from game developers. A video demonstrating the technology follows after the break.


NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699
by btarunr Today, 00:47 Discuss (86 Comments)
NVIDIA today unveiled the GeForce GTX 1080 Ti graphics card, its fastest consumer graphics card based on the "Pascal" GPU architecture, positioned to be more affordable than the flagship TITAN X Pascal at USD $699, with market availability from the first week of March, 2017. Based on the same "GP102" silicon as the TITAN X Pascal, the GTX 1080 Ti is slightly cut down. While it features the same 3,584 CUDA cores as the TITAN X Pascal, the memory amount is lower, at 11 GB, over a slightly narrower 352-bit GDDR5X memory interface. This translates to 11 memory chips on the card. On the bright side, NVIDIA is using newer memory chips than the ones it deployed on the TITAN X Pascal, which run at 11 GHz (GDDR5X-effective), so memory bandwidth is 484 GB/s.
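The quoted bandwidth figure checks out from the bus width and per-pin data rate alone (each GDDR5X chip provides a 32-bit channel):

```python
# 11 chips x 32-bit channels = 352-bit bus; at 11 Gbps per pin,
# 352 bits per transfer / 8 bits-per-byte = 484 GB/s.
chips = 11
bus_width = chips * 32                    # bits -> 352
speed_gbps = 11                           # per-pin rate, GDDR5X-effective
bandwidth = bus_width * speed_gbps / 8    # GB/s -> 484.0
print(bus_width, bandwidth)
```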

Besides the narrower 352-bit memory bus, the ROP count is lowered to 88 (from 96 on the TITAN X Pascal), while the TMU count is unchanged at 224. The GPU core is clocked at a boost frequency of up to 1.60 GHz, with the ability to overclock beyond the 2.00 GHz mark. It gets better: the GTX 1080 Ti features certain memory advancements not found on other "Pascal" based graphics cards: a newer memory chip and an optimized memory interface running at 11 Gbps. NVIDIA's Tiled Rendering Technology has also finally been announced publicly; a feature NVIDIA has kept from its consumers since the GeForce "Maxwell" architecture, it is one of the secret sauces that enable NVIDIA's lead.

NVIDIA's AIB Partners to Launch GTX 1080, 1060 With Faster GDDR5, GDDR5X Memory
by Raevenlord Today, 09:45 Discuss (19 Comments)
At their GDC event yesterday, NVIDIA announced a change to how partners are able to outfit their GTX 1080 and GTX 1060 6 GB models with regard to video memory. Due to improvements in process and scaled-down costs, NVIDIA has decided to allow partners to purchase 11 Gbps GDDR5X (up from 10 Gbps) and 9 Gbps GDDR5 (up from 8 Gbps) memory from them, to pair with the GTX 1080 and GTX 1060 6 GB, respectively. These are to be sold by NVIDIA's AIB partners as overclocked cards, and don't represent a change to the official specifications of either graphics card. With this move, NVIDIA aims to give partners more flexibility in choosing memory speeds and carving out different models of the same graphics card with varying degrees of overclock, something that was particularly hard to do on conventional 10 Gbps-equipped GTX 1080s, which showed atypically low memory overclocking headroom.
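For a rough sense of what the faster chips buy, the same bandwidth arithmetic can be applied to both cards' stock bus widths (256-bit on the GTX 1080, 192-bit on the GTX 1060 6 GB):

```python
# Peak memory bandwidth in GB/s from bus width (bits) and per-pin rate (Gbps).
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits * gbps / 8

# GTX 1080 (256-bit GDDR5X): 10 -> 11 Gbps
print(bandwidth_gbs(256, 10), "->", bandwidth_gbs(256, 11))  # 320.0 -> 352.0
# GTX 1060 6 GB (192-bit GDDR5): 8 -> 9 Gbps
print(bandwidth_gbs(192, 8), "->", bandwidth_gbs(192, 9))    # 192.0 -> 216.0
```

In both cases the upgrade is a flat 10-12.5% more peak bandwidth, which matches the overclocked-SKU positioning rather than a new official spec.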


All of the above taken from TECHPOWERUP.

Sponsored Sessions:

Wednesday 3/1/2017
Thursday 3/2/2017
Friday 3/3/2017

LHand
GameWorks + DX12.

What a combination.

And I'll say it again: what a combination...

of cr*p!!!


***
And another thing... from the description, this looks like nothing less than a way for Nvidia to sabotage the advantage AMD's cards get from Async Shaders. Sensational. Not.
 

rickrj
GameWorks + DX12.

What a combination.

And I'll say it again: what a combination...

of cr*p!!!

One more update: a Game Ready Driver optimised for DirectX 12, with a 16% DX12 performance increase. A nice improvement...


The new and updated resources include updates to the NVIDIA GameWorks SDK for creating interactive cinematic experiences in PC games; updates to the NVIDIA VRWorks SDK for creating immersive virtual reality experiences; new developer tools; and a new Game Ready Driver. Nvidia claims these toolsets will give developers access to new rendering and simulation effects, bring "substantial performance gains" to gamers, and more.

Nvidia to provide "best game experience on DirectX 12 titles"

Talking about his firm's commitment to DirectX 12, Tony Tamasi, SVP of content and technology at Nvidia, said "We have invested over 500 engineering-years of work to deliver the most comprehensive platform for developing DirectX 12 games, including the world’s most advanced physics simulation engine". Tamasi said that the new GameWorks DirectX 12 software tools will ensure GeForce gamers get the best DirectX 12 gaming experiences, "just as they have on DirectX 11 games."

Physics and VR

Nvidia wants to make the most of asynchronous compute for gaming effects, so it has introduced two key technologies to do so. Firstly, Nvidia Flow 1.0 is a visual-effects library for rendering the likes of combustible fluid, fire and smoke; secondly, Nvidia Flex 1.1 offers a DX12-compute-compatible, unified particle-based simulation technique for real-time visual effects. Another update to GameWorks physics came with the new HairWorks 1.3 for realistic fur and hair simulation.



VRWorks has been updated by Nvidia "to support DirectX 12 with better performance, lower latency and plug-and-play compatibility." It will be supported by UE4 games and the Unity 2017.1 beta onwards (ships in spring).

New developer tools have also been introduced by Nvidia to improve DirectX 12 development. New tools include the Nvidia Aftermath 1 diagnostic utility, the Nsight Visual Studio Edition 5.3 real-time VR and DirectX 12 debugger, and the PIX Plugin DirectX 12 performance debugger from Microsoft.

Game Ready Driver Optimised for DirectX 12

Coming shortly is a new GeForce Game Ready Driver optimised for DirectX 12 games. This driver will offer an impressive performance uplift of "16 per cent on average" across a variety of DirectX 12 games. Titles feeling the DX12 Game Ready driver love include: Ashes of the Singularity, Gears of War 4, Hitman, Rise of the Tomb Raider and Tom Clancy's The Division.

Interestingly, the aforementioned titles are often the ones that currently show modern AMD Radeon graphics cards in their best light. Can Nvidia really snatch these DirectX 12 jewels from AMD's grasp? We will have to wait and see.
 

Passo's
It was about time the DX12 boost came to Nvidia cards.
 

LHand
One more update: a Game Ready Driver optimised for DirectX 12, with a 16% DX12 performance increase. A nice improvement...
I hope it actually ships.

If they manage to increase performance in existing DX12 games via a driver update, that will be great and welcome. But this promise of making up for the async-shaders performance gap via Cancerworks is hot air, or rather, an artificial way of boosting DX12 performance.
 

antonioli
DX12 improvements? I'll believe it when I see it.
 


rickrj
DX12 improvements? I'll believe it when I see it.

One more site reporting it:

Nvidia are working on a new DirectX 12 optimised GPU driver

Nvidia is now working on a new Geforce driver which is designed specifically with DirectX 12 in mind, offering up to 16% performance gains across a range of DirectX 12 gaming titles.

To create this driver, Nvidia has been working with the developers of a wide range of DirectX 12 titles, like Ashes of the Singularity, Gears of War 4, Rise of the Tomb Raider and Tom Clancy's The Division, to refine its driver's codebase and make the best use of the new API. These changes should affect all DirectX 12 games moving forward, allowing users of Nvidia GPUs to achieve higher performance levels with the new API than before.


NVIDIA also revealed an upcoming Game Ready Driver optimized for DirectX 12 games. The company refined code in the driver and worked side by side with game developers to deliver performance increases of up to 16 percent on average across a variety of DirectX 12 games, such as Ashes of the Singularity, Gears of War 4, Hitman, Rise of the Tomb Raider and Tom Clancy's The Division.

Since the first launch of its Pascal architecture -- the world's most advanced DX12 GPU family, including the performance-leading GeForce GTX 1080 Ti and GTX 1080 GPUs -- NVIDIA has continuously improved DX12 game performance through releases of Game Ready drivers. The drivers are timed with the release of the latest games by leading partners.






Right now it is unknown when Nvidia plans on releasing this new driver, though we do plan on conducting dedicated GPU testing when this new driver drops.

https://www.overclock3d.net/news/gp...ng_on_a_new_directx_12_optimised_gpu_driver/1
 

antonioli
One more site reporting it:

Nvidia are working on a new DirectX 12 optimised GPU driver [...]
DX12 improvements? I'll believe it when I see it.
 

Kung Fu-tzu
So far, DX12 has been a huge fiasco; it was almost an OS "bait"!

Remember that people got absurdly hyped over the beautiful, wonderful future Microsoft promised for DX12, and many months after Windows launched we've seen nothing new, having to swallow a bit more of the same!

Nvidia now lives off selling reheated products; there's no longer any significant evolution in GPUs, leaving the impression that the manufacturer wants to develop its cards very slowly, delivering 20-30% gains per generation, which don't attract the vast majority of the public and reach only a small slice of the consumer market!

The truth is that the market lacks competition, and I guarantee that if AMD came up with truly relevant innovations, NVIDIA would leave its comfort zone and bring robust, innovative cards that pushed games toward photorealism or beyond!

It's also worth pointing out that Sony and Microsoft should pressure Nvidia so that the current consoles don't fall so far behind the Master Race, which would at least somewhat hurt the sales of these stillborn consoles, which have lacked muscle and technology since birth; just look at how reheated versions of the XONE and PS4 arrived soon after their launches!

Unfortunately, Nvidia spoke but said nothing! :facepalm
 

RareHero
Year in, year out, it's the same talk and the same promises, and when it comes time to deliver, the games are slapped together. The days when I got excited about this are long gone.
 

rickrj
Year in, year out, it's the same talk and the same promises, and when it comes time to deliver, the games are slapped together. The days when I got excited about this are long gone.
Man, for sure. I don't get excited about these promises either, especially since the games themselves haven't been exciting at all: in 90% of cases the studios are releasing recycled games with zero innovation. I looked at the 2017 release calendar and very few games excited me; honestly, I'm even getting a bit tired of games, and these underpowered consoles, together with the studios' laziness/pressure (time is money), hold back evolution even more...
 