
The PS5 and Xbox 2 thread - it's happening

Markman

da Blitz master
Patron
Joined
Dec 31, 2002
Messages
3,737
Location
Sthlm, Swe
Looks too good to be true, tbh.

 

Taxnomore

I'm a spicy fellow.
Patron
Joined
Oct 28, 2010
Messages
10,073
Location
Your wallet.
So which is it?

Lockhart is so crappy that no one will get it, or Lockhart is actually nice enough that no one will care about the XseX?
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
Supports raytracing... I mean, a GTX 1060 does too I believe, but good luck running anything with it.

If the processor, memory and SSD are similar then it might be a 1080p30 versus 4k60 kind of dynamic like I mentioned above, without damaging fundamental design too much. We're all just guessing until we know more though. I find it hard to believe that simply changing the GPU could let them go down from a super expensive machine to a much more affordable one.
 

mk0

Learned
Joined
Jun 28, 2020
Messages
113
Supports raytracing... I mean, a GTX 1060 does too I believe, but good luck running anything with it.

If the processor, memory and SSD are similar then it might be a 1080p30 versus 4k60 kind of dynamic like I mentioned above, without damaging fundamental design too much. We're all just guessing until we know more though. I find it hard to believe that simply changing the GPU could let them go down from a super expensive machine to a much more affordable one.
Well it has to have the same basic feature set (CPU, SSD, RTX) to be compatible with Series X, so nothing surprising there.

Cutting down on GPU Compute Units (CUs) lets them reduce the overall die size of the SoC, which improves yields and lowers cost: smaller chip = more usable chips salvaged per wafer. The XSX has a die size of around 360 mm², most of which is CUs; so far it's the largest and most expensive consumer-oriented product made on TSMC's 7nm node. The runner-up is AMD's Navi GPU (the 5700 XT, $399 MSRP), which is only 251 mm². Having fewer GDDR6 memory chips also makes a big difference.
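The yield argument can be made concrete with the standard dies-per-wafer approximation. This is a sketch, not mk0's math: die areas are from the post, the 300mm wafer is assumed, and defect yield is ignored entirely.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic dies-per-wafer approximation: gross wafer area divided by
    die area, minus a term for partial dies lost at the wafer edge.
    Ignores defect yield, which hurts big dies even more."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

xsx = dies_per_wafer(360)   # Series X-class SoC, ~360 mm²
navi = dies_per_wafer(251)  # 5700 XT-class die, ~251 mm²
print(xsx, navi)            # the smaller die yields noticeably more chips
```

The gap only widens once defects are factored in, since the probability of a die containing a defect grows with its area.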
 

mk0

Learned
Joined
Jun 28, 2020
Messages
113
We can do some quick math to speculate how the Series S will turn out, excuse my autism.

If the CPU portion of Series X is 120mm² (80mm² CPU cluster + 40mm² for I/O), then the GPU+RTX portion is 240mm², for 360mm² total. If we cut the GPU portion down to a third (12TF -> 4TF), we get ~80mm², so the total die size for Series S would be around 120mm² + 80mm² = 200mm². That's larger than the 5500 XT ($200) but smaller than the 5600 XT/5700 XT ($300-400; these two GPUs use the same die), so that's some pretty good cost savings.

Next is the memory portion. The Series X has 16GB of GDDR6 with a split memory pool: 10GB at 560GB/s and 6GB at 336GB/s. 2.5GB from the slower pool is dedicated to the OS, leaving 13.5GB. The Series S will instead have 10GB, with 7.5GB left after the OS. I'm too lazy to try and figure out memory buses for bandwidth, forgive me.

Here's something interesting to think about: if developers decide to fill the entirety of Series X's slower "system" memory pool and leave the other 10GB purely for textures, you end up in a bit of a pickle on the Series S. From 10GB of GDDR6 you take out 2.5GB for the OS and 3.5GB for the system pool, and you're only left with 4GB for textures. :lol:

$399 for the Series S seems like a realistic price point. The GPU is worth around $150-200; the CPU and SSD are expensive, but economies of scale will give some leeway on that front.

Edit: I fucked up, the 5500 XT is 157mm² and that's 4.7TF. However, unlike desktop Zen 2, which has more die space dedicated to L3 cache, the consoles will have less L3 cache, like the mobile Zen 2 APUs, to cut down costs. So my point still stands I guess?
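The back-of-envelope above reduces to a few lines of arithmetic. All figures come from the post itself; the assumption that the CPU+I/O area stays fixed between SKUs is mk0's, not a confirmed spec.

```python
# Die-size speculation, figures from the post.
XSX_DIE     = 360                        # mm², total Series X SoC
CPU_IO_AREA = 120                        # mm², CPU cluster + I/O (assumed fixed)
GPU_AREA    = XSX_DIE - CPU_IO_AREA      # 240 mm² for GPU + RT hardware

series_s_gpu = GPU_AREA / 3              # 12 TF -> ~4 TF, scaled linearly by area
series_s_die = CPU_IO_AREA + series_s_gpu

# Memory worst case: devs fill the X's slow "system" pool entirely,
# then the same layout is squeezed into the S's 10GB.
S_TOTAL, OS_RESERVE, SYSTEM_POOL = 10.0, 2.5, 3.5   # GB
textures_left = S_TOTAL - OS_RESERVE - SYSTEM_POOL

print(series_s_die, textures_left)       # ~200 mm² die, 4 GB left for textures
```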
 
Last edited:

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
17,133
Location
At large
I'm so glad I switched from an iPad to a Samsung Tab S6. No iOS devices at home any more.

You’ve avoided Netflix for years, especially since you’re already paying through the nose for cable. But everybody kept talking—and talking and talking—about “Tiger King.” One night, after crawling into bed, you caved and downloaded the app on your iPhone.

“Trying to join Netflix? You can’t sign up for Netflix in the app. We know it’s a hassle.”

You tap the Help button, which yields this unhelpful note: “If you’re not already a Netflix member, please join and come back.” There’s no indication where you can start your subscription.

What the Netflix app can’t tell you is that the fix is simple: Go to the web browser on your phone or computer and sign up at netflix.com.


Netflix isn’t the only hugely popular app leaving iPhone users in the dark about paying for stuff. You can’t sign up for a Spotify account or Amazon Prime membership in their respective mobile apps. Amazon’s Kindle app doesn’t let you buy e-books. Same with Rakuten’s Kobo app. Amazon-owned Audible has a complicated credit system to download audiobooks on iOS.

These apps are broken on purpose, because of Apple’s lucrative App Store rule: Companies are charged 30% of every purchase and subscription made through iOS apps. (After the subscriber’s first year, the commission is reduced to 15%.) Any developer who wants to make money on Apple’s iPhone and iPad audience must pay a hefty surcharge for that privilege.
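The commission schedule the article describes is easy to put in numbers. A small sketch, with rates taken from the article and an illustrative function name of my own:

```python
def developer_proceeds(price: float, subscriber_year: int) -> float:
    """What a developer keeps from an in-app subscription payment,
    per the article: Apple takes 30% up front, 15% after the
    subscriber's first year. Illustrative, not Apple's actual API."""
    rate = 0.30 if subscriber_year <= 1 else 0.15
    return round(price * (1 - rate), 2)

print(developer_proceeds(9.99, 1))   # first-year subscriber
print(developer_proceeds(9.99, 2))   # second year onward
```

On a $9.99 monthly subscription, that is roughly $3.00 per payment surrendered in year one, dropping to about $1.50 afterward, which is the margin Netflix decided it would rather keep.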

In December 2018, Netflix decided it no longer wanted to give Apple that cut, so it stopped letting people sign up in the app.

Blocking subscriptions and payments is just one way developers push back against the App Store’s terms. Here’s another: charging higher rates in iPhone apps. The Tinder app charges $29.99 a month for a Gold membership (which shows you everyone who’s swiped right on you). Tinder’s website charges just $13.49 a month for the same service.

“Apple is a partner but also a dominant platform whose actions force the vast majority of consumers to pay more for third-party apps that Apple arbitrarily defines as ‘digital services,’ ” said a Tinder spokeswoman. “We’re acutely aware of their power over us.”

Google-owned YouTube Music also passes on Apple’s 30% fee to customers. Apple’s App Store prohibits mentioning that a lower fee can be accessed elsewhere, a YouTube spokeswoman said.

Apple’s guidelines say developers can’t list alternative prices or discourage purchasing through the App Store in their iOS apps. An Apple spokeswoman said that developers are free to promote other pricing outside of the App Store, including on television and billboards.

The music-streaming app Tidal charges $12.99 a month for its premium tier on the iPhone but only $9.99 on its website—and Android devices.


Google, which operates the Play Store where the majority of Android apps are downloaded in the U.S., does charge up to 30% commissions on in-app transactions it handles. But its policies aren’t as ironclad as Apple’s. Whereas Apple requires all in-app purchases to go through the tech giant’s own billing software, the Play Store allows an exception for companies that host digital content and use their own payment system. As such, Tidal doesn’t have to pay any fees to Google.

On iPhones, the notable exception is Amazon Prime Video. The app historically circumvented commissions by not offering entertainment rentals or purchases to iOS users. In April, Amazon began using its own payment system to fulfill the purchases.

According to Apple, Amazon is in a program for “premium video providers” permitted to use the payment method tied to customers’ existing video subscriptions. Two European entertainment companies, Altice One and Canal+, are also in the program. But the move did seem like a concession aimed at getting Amazon Prime Video—of which Amazon reports over 150 million members world-wide—to finally work on the Apple TV device.

Google isn’t just more relaxed about payment systems. Android app makers in the Play Store are allowed to tell users to subscribe elsewhere. And because Android is an open ecosystem, people can download their apps directly from developers or through other app stores, and Google doesn’t get a cut. (When Epic Games Inc. launched the popular Fortnite, it bypassed Google’s Play Store for 18 months to evade fees.)

So while Android holds most of the global smartphone market share—around 85% on Android vs. 14% on iOS—Apple has borne the brunt of public and regulatory scrutiny about the App Store’s policies and business model.

“Apple doesn’t have a monopoly on smartphones, but it’s hard to say that they don’t have a monopoly over iOS users,” said David Barnard, an independent developer who’s had three apps in Apple’s App Store over the past 12 years. Besides, Apple is historically better than Google at monetizing apps. “If you want to exist on mobile, you have to go through Apple as a gatekeeper.”

Apple decides what does and doesn’t have to use its billing system. (Yes to games like Candy Crush Saga; no to services like Airbnb and Uber.) That leverage is an issue that was recently brought to the fore when the developers of an email app called Hey prompted a firestorm in the developer community by saying Apple enforces its policies unevenly.

Hey, still in beta, is charging $99 a year for access to its privacy-forward email service. Apple rejected the app on the grounds that it lacked a sign-up (i.e., pay-up) option. The Apple spokeswoman said users should be able to download an app and use it right away. The exemption granted to Netflix, Spotify and others in the “reader” category did not apply to Hey.

“Apple just doubled down on their rejection of Hey’s ability to provide bug fixes and new features, unless we submit to their outrageous demand of 15-30% of our revenue,” tweeted David Heinemeier Hansson, chief technology officer of Hey developer Basecamp.

Apple eventually approved the Hey app—Hey bent to the App Store’s rules by creating a workaround, in the form of a free trial account that expires after two weeks.

Apple’s power to determine which app makers can and can’t operate a business is now under official regulatory review. Last week, the European Union launched a probe into whether Apple violated competition laws following a complaint by Spotify, which called the App Store’s 30% commission a “discriminatory tax” that gives an unfair advantage to Apple’s in-house streaming service Apple Music.

The Apple spokeswoman said its fees are used to fund the company’s efforts to reduce spam, malware and fraud, as well as its constant review of apps for privacy, security and content purposes. She pointed to the free developer tools Apple provides, such as TestFlight for beta testing, technical support, compilers and Xcode, the software environment that allows developers to build their apps.

Mr. Barnard says he has paid $700,000 in fees to Apple over 12 years—more than his current net worth—but agreed there’s “unequivocally” a benefit to developing for Apple’s App Store: “I didn’t have to manage a web store, downloads, payments, taxes, VAT. Apple has taken so much complexity out for businesses and consumers.”

What’s great about the iPhone is, whatever Apple doesn’t build itself, someone else builds for it. Imagine an iPhone without Uber, or an iPad without YouTube. But when developers deliberately break their own apps, Apple should at least let them tell their customers why.

(Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services.)
 
Last edited:

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
The way RTX does raytracing is kinda retarded, to be honest. Next-gen raytracing is going to be much more efficient than that. It will also utilize CPU cores and won't run on GPUs alone. Let's wait for the hardware to arrive in our hands before we jump to conclusions.
 

aweigh

Arcane
Joined
Aug 23, 2005
Messages
18,155
Location
Florida
My RTX 2070 gets around 45-50fps in Quake 2 RTX with default settings at 1080p; reducing the resolution scale to 90% brings it up to 60fps.

So far that has been my only exposure to ray tracing and :shrug: it was alright. Yeah, it looks pretty but I wasn't blown away. At all.

Hell, I think Kingdom Come: Deliverance has ten times better lighting, even if it isn't real-time like the RT lighting in Quake 2.

EDIT: Been meaning to try out Minecraft RTX to test it but I don't want to spend money on it, since I don't really have any interest in playing it.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
17,133
Location
At large
The difference between non-RTX and RTX Shadow of the Tomb Raider is pretty serious IMO. Try that.

Also, I'm thinking Cyberpunk will play the decisive part in turning RTX into a mainstream feature, it's just a perfect playground for RTX. After it RTX will become a standard in AAA shit at least.
 

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
The difference between non-RTX and RTX Shadow of the Tomb Raider is pretty serious IMO. Try that.

Also, I'm thinking Cyberpunk will play the decisive part in turning RTX into a mainstream feature, it's just a perfect playground for RTX. After it RTX will become a standard in AAA shit at least.

RTX will not become a mainstream feature because it is an Nvidia gimmick. Proper raytracing will be handled by Vulkan and D3D12 when the time comes. RTX is shit, it uses too many resources for a mediocre effect.

Raytracing will never become mainstream if you have to spend 600 euros or more for barely decent 1080p performance. It's not happening.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
17,133
Location
At large
The difference between non-RTX and RTX Shadow of the Tomb Raider is pretty serious IMO. Try that.

Also, I'm thinking Cyberpunk will play the decisive part in turning RTX into a mainstream feature, it's just a perfect playground for RTX. After it RTX will become a standard in AAA shit at least.

RTX will not become a mainstream feature because it is an Nvidia gimmick. Proper raytracing will be handled by Vulkan and D3D12 when the time comes. RTX is shit, it uses too many resources for a mediocre effect.

Raytracing will never become mainstream if you have to spend 600 euros or more for barely decent 1080p performance. It's not happening.
Just quoting to keep this post handy.
 

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
Just quoting to keep this post handy.

This is not 2014, so crapworks (tm) is not going to be repeated this time. The main benefit of, and reason to use, raytracing is that it is far less work for the developer. Seriously. They don't need Nvidia's "help" to do raytracing, so Nvidia "gifting" them a black box of Nvidia-optimized effects and telling them to use it instead of developing their engines themselves, like they did with shitworks (tm), ain't happening.

D3D12 raytracing already exists - DXR, now part of the D3D12 Ultimate feature set. And the AMD GPUs that will land in a few months are going to support it, alongside the equivalent Vulkan extensions. Intel is also releasing GPUs soon, and they are going to support D3D12 and Vulkan too. So RTX is dead in the water; no developer is going to use an Nvidia-only solution unless they get "incentivized" (=BRIBED) to do so. Nvidia do have money, but they can't bribe everyone.
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
My RTX 2070 gets around 45-50fps in Quake 2 RTX with default settings at 1080p; reducing the resolution scale to 90% brings it up to 60fps.

So far that has been my only exposure to ray tracing and :shrug: it was alright. Yeah, it looks pretty but I wasn't blown away. At all.

It's purely for tech fetishists in Quake 2, I think. It looks aesthetically worse than the original IMO; it's just neat for people into tech like that to see a game fully ray traced from top to bottom. Control's reflections and Metro 3's global illumination are the two that sell what the tech can do, IMO. Control looks so much better with it on that I felt pretty bad turning it off so I could play at native res at 60fps.

The difference between non-RTX and RTX Shadow of the Tomb Raider is pretty serious IMO. Try that.

Someone just said this elsewhere, maybe it was you, I dunno, but this is the opposite of what pretty much every tech site has said. I've watched comparison videos and rundowns on it, and they all pretty much agree the RTX shadows look worse than normal ultra shadows, due to their draw-in being too close to the camera and the effect itself being only meh. I own the game though, so I'll see for myself eventually I guess.
 

AwesomeButton

Proud owner of BG 3: Day of Swen's Tentacle
Patron
Joined
Nov 23, 2014
Messages
17,133
Location
At large
It was me :) And I said it after watching the Digital Foundry interview with the programmers who implemented RTX in Shadow of the Tomb Raider; it's a 50-minute video full of examples.
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
It was me :) And I said it after watching Digital Foundry interview with the programmers who implemented RTX in Shadows, it's a 50 minute video full of examples.

It's the pop-in I saw most other channels complaining about, and I saw some videos where ultra shadows had zero pop-in at all but RTX shadows drew in very obviously not far from the player. Pop-in is one of my major pet peeves with graphics, so I doubt I could enjoy them for that reason. But obviously I'll give them a shot.
 

cosmicray

Savant
Joined
Jan 20, 2019
Messages
436
You're probably in the majority though, as people tend to like sharpened images. That's why studios sharpened everything in the mainstream movie disc era. It's only now that it's becoming an enthusiast market that companies like Arrow are doing raw film scan releases.
I wouldn't mind sharpening, since it won't change the aesthetic much, but why remastered releases go through the "teal & orange" travesty is beyond me.
 

DalekFlay

Arcane
Patron
Joined
Oct 5, 2010
Messages
14,118
Location
New Vegas
I wouldn't mind sharpening, since it won't change the aesthetic much, but why remastered releases go through the "teal & orange" travesty is beyond me.

Teal was more prevalent in the old days than people think, because fluorescent lights show up as teal on film. DVD masters usually had a warm push to their color grades, to counteract the super-cool color temperatures of TVs of the time. You're not wrong, though, that many remasters add extra teal and orange because they think it's more "modern." Even geniuses like William Friedkin and Michael Mann have directly said they added teal to their movies to make them feel more like movies made today. It's baffling.
 

Azdul

Magister
Joined
Nov 3, 2011
Messages
3,710
Location
Langley, Virginia
The way RTX does raytracing is kinda retarded, to be honest. Next-gen raytracing is going to be much more efficient than that. It will also utilize CPU cores and won't run on GPUs alone. Let's wait for the hardware to arrive in our hands before we jump to conclusions.
There are two parts to raytracing - tracing real rays, and extrapolating the results to simulate a much larger number of rays.

The first part is an embarrassingly parallel algorithm and can run on either GPU or CPU. In both cases it is equally hard, as it needs constant access to the complete geometry of the scene - large memory structures that do not fit in low-level cache. Intel used raytracing to demonstrate the advantages of the Larrabee architecture because it was a particularly good fit.

If you use anything other than the 512 Larrabee cores used for Intel's demos, you are limited in the number of rays you can reasonably trace. That's where the second part comes in - extrapolating the results - and the tensor cores of RTX are reasonably good at it.

What is possible without hundreds of cores and without RTX? Probably reflections not limited to screen space on some surfaces, as seen in PS5 gameplay footage and AMD demos, and not much more.
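The two-part split can be caricatured in a few lines: an embarrassingly parallel trace of a tiny ray budget per pixel, followed by a cheap spatial filter standing in for the far smarter denoising that RTX tensor cores perform. Everything here is a toy stand-in, not real ray tracing:

```python
import random

def trace_ray(x: int, y: int) -> float:
    """Stand-in for the expensive part: shooting a real ray into scene
    geometry. A real tracer walks an acceleration structure over the
    whole scene, which is why small caches don't help much. Here we
    just return a noisy shade to imitate a low ray budget."""
    return max(0.0, min(1.0, 0.5 + random.uniform(-0.3, 0.3)))

def render(width: int, height: int, rays_per_pixel: int = 1):
    # Part 1: embarrassingly parallel - every pixel is independent,
    # so this loop could be farmed out to any number of cores.
    noisy = [[sum(trace_ray(x, y) for _ in range(rays_per_pixel)) / rays_per_pixel
              for x in range(width)] for y in range(height)]
    # Part 2: "extrapolate" the sparse result - a crude 3x3 box filter
    # standing in for hardware-accelerated denoising.
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total, count = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if 0 <= y + dy < height and 0 <= x + dx < width:
                        total += noisy[y + dy][x + dx]
                        count += 1
            out[y][x] = total / count
    return out

img = render(8, 8)
```

The point of the structure, per the post, is that part 1 scales with raw cores while part 2 is where dedicated hardware earns its keep.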
 

TemplarGR

Dumbfuck!
Dumbfuck Bethestard
Joined
May 30, 2013
Messages
5,815
Location
Cradle of Western Civilization
There are two parts to raytracing - tracing real rays, and extrapolating the results to simulate a much larger number of rays.

The first part is an embarrassingly parallel algorithm and can run on either GPU or CPU. In both cases it is equally hard, as it needs constant access to the complete geometry of the scene - large memory structures that do not fit in low-level cache. Intel used raytracing to demonstrate the advantages of the Larrabee architecture because it was a particularly good fit.

If you use anything other than the 512 Larrabee cores used for Intel's demos, you are limited in the number of rays you can reasonably trace. That's where the second part comes in - extrapolating the results - and the tensor cores of RTX are reasonably good at it.

What is possible without hundreds of cores and without RTX? Probably reflections not limited to screen space on some surfaces, as seen in PS5 gameplay footage and AMD demos, and not much more.

Your info about raytracing is very old and Nvidia-biased. 512 Larrabee cores are extremely weak by modern standards. Remember, Larrabee cores are essentially Pentium 1 cores. Without MMX. Just the original Pentium core, an in-order CPU. Its IPC is much lower than modern ARM. So even 512 of those things are not impressive compared to something like 16, 32, or 64 Ryzen cores. And we are much closer to the age of the 64-core desktop CPU than people think, and then everything will be raytraced.

Contrary to popular belief, a single core running at, say, 4GHz is better than 4 cores running at 1GHz each. Much better. There is no comparison, really; it is always better to have a single powerful core than several weaker ones, no matter the problem you want to solve. The reason is that multiple processes can take turns on a powerful core and finish at essentially the same time, but a single heavy process can't easily exploit more than one core; there is always a cost.

Larrabee was a tech demo for Intel. Intel wanted to see if they could take their CPU cores, strip out the CPU-only features to save transistor area and thermals, and make a GPU. It failed. This does not mean that CPU-based raytracing will fail.

The problem with RTX, essentially, is that it is not real raytracing. It is essentially a shader effect running on traditional raster graphics. And it is very unoptimized. AMD has filed a patent for a solution that is better and exploits the CPU cores more. So let's wait and see....
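The fast-core-versus-many-cores tradeoff argued above is usually framed as Amdahl's law: the serial fraction of a workload caps the speedup extra cores can give. A toy calculation, with fractions chosen purely for illustration:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup over one core when only `parallel_fraction` of the
    work can be spread across `cores` (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A half-serial workload barely benefits from piling on cores...
print(round(amdahl_speedup(0.50, 64), 2))   # ~1.97x
# ...while a near-fully-parallel one (ray tracing per-pixel work
# is the textbook example) scales much further.
print(round(amdahl_speedup(0.99, 64), 2))   # ~39.26x
```

Which side of this curve per-frame game logic sits on, versus the per-pixel tracing work, is exactly what the disagreement in this thread comes down to.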
 

abija

Prophet
Joined
May 21, 2011
Messages
3,299
Dream of every gamer... 64 core cpu, 63 used for RT.
 
