Poll: Are you interested in buying a GPU with RTX?
  • Yes, if it's in a different product so that I can buy it separately. - 0 votes (0%)
  • Yes, if it's in the same product so that I must buy it if I purchase a GPU. - 0 votes (0%)
  • No, the cost would have to be 2x less. - 0 votes (0%)
  • No, the cost would have to be 5x less. - 0 votes (0%)
  • No, it's just hype. - 1 vote (20%)
  • No, the performance is too low. It must be 10x faster to be worth it. - 0 votes (0%)
  • No, the performance is too low. It must be 100x faster to be worth it. - 1 vote (20%)
  • No, I don't play games (liar!). - 1 vote (20%)
  • I'll wait till all the bugs are worked out, then I'll buy. - 2 votes (40%)
Total: 5 votes (100%)

POLL: Do you want RTX? (Cross-website post)

#1
Hey, I heard from one of the people who do HW reviews that AMD was considering implementing their very own Ray Tracing eXtensions and looking at what people think of RTX.
I got 5 votes in 23 hours at LQ, so I'm trying to reach a larger audience by posting here. Please vote on only one site.
If the RTX GPU has Linux support and some games to play, would you purchase a GPU with RTX or RTX-like HW?
Bear in mind that, AFAIK, current Nvidia GPUs do RTX by partially ray tracing the scene (shadows only) at a low resolution, then using an AI to fill in the missing shadow pixels, and finally using the AI (through DLSS) to blow the image way up to 1080p at an astounding ~30fps (in Shadow of the Tomb Raider)! Other (FLOSS) implementations appear to have better performance: http://brechpunkt.de/q2vkpt/
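That hybrid pipeline (sparse low-resolution ray tracing, NN gap-filling, then upscaling) can be sketched very loosely. Everything below is illustrative stand-in code showing only the stage order, not Nvidia's actual API or algorithms (the denoiser and DLSS are proprietary neural networks; here they are faked with simple averaging and nearest-neighbour scaling):

```python
import random

def trace_sparse_shadows(w, h, sample_rate=0.25, seed=1):
    """Ray-trace only a fraction of the pixels; the rest stay unknown (None)."""
    rng = random.Random(seed)
    return [[rng.random() if rng.random() < sample_rate else None
             for _ in range(w)] for _ in range(h)]

def denoise(image):
    """Stand-in for the neural denoiser: fill holes with the row average."""
    out = []
    for row in image:
        known = [p for p in row if p is not None]
        fill = sum(known) / len(known) if known else 0.0
        out.append([p if p is not None else fill for p in row])
    return out

def upscale(image, factor=2):
    """Stand-in for DLSS: naive nearest-neighbour upscale to the target size."""
    return [[p for p in row for _ in range(factor)]
            for row in image for _ in range(factor)]

# Render at a quarter of the target size, fill the gaps, then blow it up.
low = denoise(trace_sparse_shadows(8, 4))
full = upscale(low, factor=2)
print(len(full), len(full[0]))  # 8 16
```

The point of the sketch: most pixels are never actually traced, which is why the raw performance numbers look so modest compared to fully ray-traced renderers.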
My intention is to give AMD some idea of what you guys think.
Some people who have replied thus far think RTX is marketing jargon and/or that this poll is intended to force a technology upon companies.
RTX is not jargon; it is a large portion of current Nvidia GPUs' die space. You pay for it.
As for forcing, that was *never* my intention. With AMD getting FreeSync support merged into their driver about a month ago (leaving Radeon Chill as the only major thing left, AFAIK), now is a time when we are considered relevant, at least by one company, and have the chance to be included in the conversation!

I'm assuming that if you don't know what RTX is, you'll want to do your own independent research, but for the lazy here are two excellent talks on the matter:
https://www.youtube.com/watch?v=SrF4k6wJ-do
https://www.youtube.com/watch?v=CT2o_FpNM4g

Feel free to comment also!

Note to voters: most of the answers that are "no..." become "yes..." if/when Nvidia and AMD fix the stated problem. I have listed the common ones.

#2
We are commenting on *both* companies, not just Nvidia. Pick your favorite and vote with them in mind.

#3
RTX is good enough to render convincing results already. But AFAIK it renders some sample traces and fills the gaps with a trained neural network. So the results are already good enough for gaming, but it's a cheap shortcut offered at a premium price. When it comes to new consumer-grade tech I'd wait at least 1-2 years till it's:
1) really good
2) affordable
3) reviewed by different sources

If you really want to upgrade your computer, go down this list until you're in your price range, do some research on similar cards there, and buy the one you're sure will be a worthy 3-4 year upgrade.


EDIT: should've read the post instead of assuming the content based on the title and the usual forum content, but that sums up my thought process on this. I've had an R9 390X with 8 GB RAM for a while now and I still don't feel like an upgrade is necessary any time soon, also considering that gaming companies need to account for the majority not having high-end stuff, which makes the upgrade process slow enough.

#4
(02-19-2019, 06:00 AM)Lyberta Wrote: Then it's a bit harder to vote. I'd say when a $200 card can give 100 fps at 2560x1440 with a free driver, sure, I'd use it; otherwise, no. FPS is more important than graphical quality.

Lyberta, you seem to be here and on freegamedev, could you post a reply on only one site?
Then "No, the performance is too low. It must be 100x faster to be worth it." would be your answer, because then even the cheaper cards would have great performance before you bought. (It would, in time, become, "Yes, now the performance is 100x faster and the mid-range cards can do 2560x1440".)

#5
(02-19-2019, 06:08 AM)_para Wrote: When it comes to new consumer-grade tech I'd wait at least 1-2 years till it's:
1) really good
2) affordable
3) reviewed by different sources

1) You might select the speed increase vote.
2) Price will come down as speed increases (or you could go for the price-decrease vote); you'll be able to buy from the mid tier with a 100x increase in speed. But, having an R9 390X 8G, you might go with the "Yes, if it's in a different product so that I can buy it separately." option, because that will decrease price much faster than the industry innovates the technology.
3) In my poll I'm assuming yes for the sake of simplicity.

Quote:_para

EDIT: should've read the post instead of assuming the content based on the title and the usual forum content, but that sums up my thought process on this. I've had an R9 390X with 8 GB RAM for a while now and I still don't feel like an upgrade is necessary any time soon, also considering that gaming companies need to account for the majority not having high-end stuff, which makes the upgrade process slow enough.

#6
Yeah, that sounds reasonable; problem is, I don't know how to edit my vote.
But "faster for the current price" reads like "cheaper current tech" in my calculation. I'm not after the latest and greatest but the one that actually fits my needs for the best price/budget.

Did that first vote mean a dedicated RTX card? I understand them like Google's TPUs, which are really good for their task but only good for that (or similar) tasks.
If it's like an upgrade card I could plug in next to my card that just extends the functionality, I'd buy it if the mid-high range is about 50-80€; else I don't feel like it's worth it, because in the end the gameplay is what brings the fun while graphics are just a bonus.

Also the last option ("I'll wait till all the bugs are worked out, then I'll buy.") is what keeps me away. The problem here is the neural nets, which are in fact really accurate, but there may be some ugly edge cases which require them to be retrained. It would be even worse if the neural net is implemented in hardware.

So in the end it just seems like an over-hyped bonus feature to me, while I kinda understand how big of a step it is going from efficient screen-space reflections to ray tracing.

#7
(02-20-2019, 08:35 AM)_para Wrote: Yeah, that sounds reasonable; problem is, I don't know how to edit my vote.
But "faster for the current price" reads like "cheaper current tech" in my calculation. I'm not after the latest and greatest but the one that actually fits my needs for the best price/budget.
That is typical for users. Don't think that companies don't know that. I'm that way too.

Quote:Did that first vote mean a dedicated RTX card? I understand them like Google's TPUs, which are really good for their task but only good for that (or similar) tasks.
If it's like an upgrade card I could plug in next to my card that just extends the functionality, I'd buy it if the mid-high range is about 50-80€; else I don't feel like it's worth it, because in the end the gameplay is what brings the fun while graphics are just a bonus.
Either that or a socketed GFX card. During the node shrinks from 32nm down, yields have gotten worse and cost has increased at a greater rate than in all previous nodes. A separate die, connected via PCIe, IF, or any other method, and sold separately, would give users a choice and a better price.

Quote:Also the last option ("I'll wait till all the bugs are worked out, then I'll buy.") is what keeps me away. The problem here is the neural nets, which are in fact really accurate, but there may be some ugly edge cases which require them to be retrained. It would be even worse if the neural net is implemented in hardware.

So in the end it just seems like an over-hyped bonus feature to me, while I kinda understand how big of a step it is going from efficient screen-space reflections to ray tracing.
NNs (neural nets) and ray tracing HW are very different; I'll explain, but I make no guarantee you won't still be confused.
NNs decide on what "is", based off looking for patterns which are initially really small. A child says "Mommy, look in the clouds is a dragon!" "Yes, dear," the mother responds thinking of how ridiculous it is for her to be expected to find the dragon in the whole sky of clouds based solely on a small child's finger pointing a rough direction.
How does the child do that? They see a mouth, then an eye, then a neck, the next part hardly looks like a body, but who cares? They have found their dragon!
Likewise, upon seeing a tire and a window, an AI might conclude that it has found a car, when what it is viewing is a picture of an automotive shop. The developer then marks the AI's answer as wrong and runs the AI on many, many car shops in hopes that the AI never makes that mistake again, much like the mother wishes that her child would "grow up" and stop seeing things in the clouds.
That was very simplified, but I hope you could follow it. It is a fairly accurate picture of what happens in an AI algorithm.
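As a toy illustration of that correction loop (all numbers and feature names here are made up; real vision models are vastly more complex), here's a perceptron-style sketch: the "AI" guesses "car" from tire/window/body evidence, and every wrong answer nudges its weights, just like the developer marking the car-shop mistake:

```python
def predict(weights, features, bias):
    """Return True ("it's a car") if the weighted evidence exceeds the bias."""
    score = sum(w * f for w, f in zip(weights, features))
    return score > bias

def train(weights, bias, examples, lr=0.1, epochs=50):
    """Perceptron-style training: nudge weights whenever the guess is wrong."""
    for _ in range(epochs):
        for features, is_car in examples:
            guess = predict(weights, features, bias)
            error = (1 if is_car else 0) - (1 if guess else 0)
            if error:  # the developer marks the answer as wrong
                weights = [w + lr * error * f
                           for w, f in zip(weights, features)]
    return weights

# Features: [has_tire, has_window, has_body].
# A tire + window alone (a photo of an auto shop) is NOT a car.
examples = [
    ([1, 1, 0], False),  # tire, window, no body: auto shop
    ([1, 1, 1], True),   # tire, window, body: actual car
    ([0, 0, 0], False),  # empty street
]
weights = train([0.5, 0.5, 0.5], bias=0.6, examples=examples)
print(predict(weights, [1, 1, 0], 0.6))  # False: the shop no longer fools it
```

The design point: nothing in the code "understands" cars; it only adjusts numbers until the mistakes stop, which is why unseen edge cases can still require retraining.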

Ray tracing, OTOH, uses a fixed algorithm to trace light from a source to its eventual destination. There is no attempt at extracting larger details out of smaller ones, and no need for "training" the ray tracing HW.
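To make the contrast concrete, here is a minimal sketch of that fixed algorithm: a single ray-sphere intersection test, solved in closed form. The scene numbers are made up, and it ignores the origin-inside-sphere case for brevity, but the same inputs always produce the same answer, with nothing learned and nothing guessed:

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Solve |o + t*d - c|^2 = r^2 for the nearest t >= 0, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# A ray straight down the z axis hits a sphere centered 5 units away...
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))   # 4.0
# ...and a ray pointed the other way misses.
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, 5), 1))  # None
```

A renderer just repeats this deterministic test for every ray against every object, then spawns shadow/reflection rays from the hit points; there are no weights anywhere to retrain.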

However, it is worth noting that Nvidia does rely on its AI NNs to a large extent when rendering using RTX.


#8
I wouldn't be a fan of sockets. Maybe if it becomes the standard for all boards in about 5 years, like soundcards.
Dedicated cards would be stupid in the sense of only properly working with hardware from the same company, and sockets may have the same problem at the motherboard level. On that end Nvidia really seems to be a dick.
But I'd still rather wait for better solutions. I think in a few years the graphics cards will be worth the hype of now.

No worries, I kinda understand NNs, but thanks for explaining. (On my GitHub below is a simple backpropagation one in Lua, with some comments.)

#9
I plan to live with my current GTX 1080 for 4-5 years; then I'll see what's there to update it with.

#10
(02-20-2019, 08:34 PM)Lyberta Wrote: And NNs are not just used for classification generally.
Obviously I'm giving a simple explanation that I think can be easily understood. I am not an AI scientist. I have not read all the various papers and books, and I certainly don't have access to budding research in the field like Google's DeepMind AI.

#11
(02-21-2019, 10:48 AM)_para Wrote: No worries, I kinda understand NNs, but thanks for explaining. (On my GitHub below is a simple backpropagation one in Lua, with some comments.)

Ha! I've only been doing a lot of reading on the subject and you have a whole implementation!

#12
(02-21-2019, 06:12 PM)morosophos Wrote: I plan to live with my current GTX 1080 for 4-5 years; then I'll see what's there to update it with.

The whole question was intended to foster a conversation such that I could direct it to AMD and say, "Here's what the Linux community thinks of RTX." (As I [tried to] make clear in my initial post.)
In 4-5 years they will have already made up their minds on the whole RTX debate, and we'd be out of luck if we see things differently.
So do please vote. It's not a commitment to buy a GPU any time soon.

#13
The poll should end tonight (in about 6½ hours).
If you did not vote yet you still have a chance.

#14
I would buy a GPU that surpasses Doom Eternal's high requirements with all the settings maxed out. But I'll wait for a new id Software game that may exceed those requirements. I liked how Doom Eternal's fast FPS gameplay is designed for experienced shooter players. Such a GPU would also be useful for Unity game engine development projects.

#15
https://forums.xonotic.org/showthread.php?tid=8761
Relevant thread; I posted that and found this one while looking for more info on the possibility. On the matter at hand: I have the Radeon XFX R9 390 at the moment, a somewhat older GCN 2.0 card, but one which supports Vulkan. If I'm not mistaken, Vulkan support automatically means GPU ray tracing is possible in some form? If not, I may have to wait for RTX GPUs to fully establish their architecture and have their prices drop a bit, then hope I'll have the money to get one with an even better implementation someday.



