Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
24th September 2020, 17:51 | #1841 | Link |
Registered User
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
|
Using Tensor cores for madVR sounds like a joke.
A bad joke actually.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1) HEVC decoding benchmarks H.264 DXVA Benchmarks for all |
24th September 2020, 17:58 | #1842 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Using cores built for neural-network operations in a video renderer full of neural-network scalers is a bad joke...?
edit: about the bug. The following is what I encounter, and sadly I need other users to reproduce it to confirm it. With 456.38 the power management mode is ignored, so if you use DDU or the clean-installation option in the driver, you will be stuck with "optimal power". So how did I "prove" this? I contacted nVidia support because my GPU was stuck at 800 MHz during madVR playback - the usual issue with "optimal power" - even though I had changed this setting. I followed their instructions: install an older driver, apply some very questionable settings in the nVidia control panel, and select "prefer maximum performance". After that "worked", I was supposed to install the newest driver again. It works now, but I'm stuck at maximum performance: the GPU idles at 1.5 GHz. So what I'd like from some users here is a report on whether maximum performance works on your system, and whether the idle GPU clock is in the GHz range if your driver was not set to maximum performance before. If you'll excuse me, I have to install 452.22, set it to adaptive, and install 456.38. "fun" Last edited by huhn; 24th September 2020 at 18:37. |
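For anyone willing to help reproduce this, here is a minimal sketch (my own, not huhn's procedure) that reads the current graphics clock via nvidia-smi, assuming the tool is on the PATH, so you can tell at a glance whether your GPU is idling in the GHz range:

```python
import shutil
import subprocess

def idle_graphics_clock_mhz():
    """Return the current graphics clock in MHz, or None if nvidia-smi is unavailable."""
    if shutil.which("nvidia-smi") is None:
        return None
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # One line per GPU, e.g. "1500"; just look at the first GPU.
    return int(out.splitlines()[0])

if __name__ == "__main__":
    clock = idle_graphics_clock_mhz()
    if clock is None:
        print("nvidia-smi not found")
    elif clock > 1000:
        print(f"{clock} MHz at idle - looks stuck at maximum performance")
    else:
        print(f"{clock} MHz at idle - power management seems to be working")
```

Run it with the desktop idle (no video playing); a reading well above 1 GHz at idle would match the stuck-at-maximum-performance behaviour described above.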
24th September 2020, 18:55 | #1843 | Link | |
Registered User
Join Date: Mar 2002
Posts: 2,323
|
Quote:
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config |
|
24th September 2020, 21:41 | #1844 | Link |
Registered User
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
|
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1) HEVC decoding benchmarks H.264 DXVA Benchmarks for all |
25th September 2020, 12:00 | #1846 | Link |
Registered User
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
|
3090 is a disaster.
It needs around 375W (custom 3090s are over 400W) and $1,500 to be 10% faster in 4K gaming than the $700 3080. If 4K is a niche for most people, then 8K gaming is extra-terrestrial right now: there are no monitors, and most games are not ready for this resolution. 8K gaming is feasible mainly in DLSS-enabled games using the new "Ultra Performance" mode, rendering the game at 1440p and upscaling to 8K - a 9x (!) upscale.

Also, it's not a professional card. Very far from it. Steve from Gamers Nexus is not right on this one. It's a lot slower than the Turing TITAN RTX in applications like CAD, design, etc. nVidia didn't enable the Quadro driver optimizations on a GeForce card like the 3090, as it did with the Titan series. The result is a card much slower than even the Turing Titan RTX in these cases. The 3090 also has very slow FP64 performance and doesn't support GPU virtualization/sharing via SR-IOV. nVidia probably reserved these things for a $3,000 Ampere Titan.

So, no. Most of the users of this forum were wrong: the 3090 is not a replacement for the Titan cards. It's useful only to content creators using Blender or other rendering software, editing 8K video, or running any productivity workload that can't fit in the 3080's 10GB of VRAM, because the chip and the drivers are exactly the same for the 3080/3090, with the features of real professional software disabled. Prosumer Titan and professional Quadro users doing CAD, design or AI training should wait for a $3,000 Titan Ampere or Quadro Ampere to do their job.

So if the 3090 isn't worth it for gaming and doesn't have professional/workstation drivers, what is it worth? Ask the leather-jacket man. Or even better: gamers, prosumers and professionals could wait for the RDNA2 cards. I hope this time AMD will grab the opportunity.
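For reference, the 9x figure quoted above for "Ultra Performance" mode is just the ratio of pixel counts between the 8K output and the 1440p internal render, quickly checked:

```python
# Pixel counts: 8K UHD output vs. the 1440p internal render used by
# DLSS "Ultra Performance" mode.
out_8k = 7680 * 4320        # 33,177,600 pixels
render_1440p = 2560 * 1440  # 3,686,400 pixels

scale = out_8k / render_1440p
print(scale)  # 9.0 - each rendered pixel becomes nine output pixels
```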
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1) HEVC decoding benchmarks H.264 DXVA Benchmarks for all |
25th September 2020, 12:36 | #1847 | Link | |
Registered User
Join Date: Sep 2017
Posts: 46
|
Quote:
The selected power mode is not managed; it stays stuck at the last value applied with the previous driver version. (Where is the validation team, and is there no regression-test database on nVidia's side? ...) It also seems that CRU no longer works (via the extension blocks and DisplayID). Last edited by Polopretress; 25th September 2020 at 12:42. |
|
25th September 2020, 13:11 | #1848 | Link |
Registered User
Join Date: Jul 2014
Posts: 942
|
@NikosD
Any chance you could calm down a bit and stop the misinformation?

Custom 3090 GPUs don't all need more than 375W. Apart from the fact that I like EVGA's support and 3-year warranty, I chose the XC3 because it only needs two 8-pin power connectors and draws a max of 375W. It's also smaller than the 3090 FE, so I hope it will fit in my case. 375W is at full load; it will draw a lot less otherwise.

Why are you so worked up about companies' strategies? They are not your friend or mine. There is nothing personal. They have a single goal: maximising profits for their shareholders. AMD is no more friendly than nVidia. Who cares who makes a GPU, as long as it's the best choice for your needs?

Anyway, re your other points: I don't really play games - I don't have the time - though I occasionally play fairly old games with my daughter. Apart from madVR, I plan to use the 3090 with Flight Simulator 2020 (in 4K) and for video editing and post-production work in 4K/8K workflows (I use studio drivers). Both applications need more than 10GB, so there are definitely some use cases for it, even if it's often more a want than a need. If people are stupid enough to buy one to play games in "8K", then that's their problem, but I don't think many do that.

Currently the 3090 is the best solution for my needs, especially as I don't upgrade often. I bought my 1080 Ti in 2017, I have zero interest in Turing due to the lack of HDMI 2.1, and I'm hoping to keep the 3090 (if it fits in my case) for at least 2-3 years. AMD is not even on the table due to the lack of tensor cores and BT.2020 flag support, so it's nice not to even have to care about what they might release next month. They could release a card more powerful than the 3090 for half the price; they would still not get my money. When/if madVR starts using tensor cores, I want to be able to make the most of it, instead of having to upgrade once more. And being able to set the BT.2020 flag is a requirement for me.
I don't switch gamut or calibration manually in my cinema room, sorry. I start a film with my remote control or my iPad, and the HD Fury does the rest.

Sure, I'd rather buy a 3080 Ti with 20GB for 30% less money, as that would suit my needs just as well, but we don't know when it will materialise or in what quantity, and the time I waste on forums and resellers trying to find out or to secure a new GPU IS money. On the other hand, it's quite nice to have the flagship with the full bus and full memory (until the Ampere Titan shows up, but I have no interest in that) and to stop thinking about upgrading for a while. I really don't care about overclocking, getting top scores in benchmarks, water-cooling, etc.

I managed to snatch a pre-order for the 3090 during the 1-2 hour window it was available (entirely by luck - I thought I'd try once on the 24th). I was able to place an order for the model I wanted - the EVGA XC3 Gaming - at the price I was expecting to pay, knowing that I can return it if it doesn't fit my case, so I did it. My situation is slightly different from most, though, because it's a business purchase, as I use it primarily for work, both with madVR and with video post-production. I get the 20% VAT back and I save 20% in taxes on company profits, so it effectively costs me 40% less than the full price. It's still expensive, but as a business expense it's not that bad compared to a high-end laptop, projector or AVR. I certainly wouldn't recommend a 3090 to anyone who doesn't need one and can wait for the 3080 20GB.

I'm curious to see how much my CPU will be a bottleneck for my use case; I expect it will be. I've budgeted an upgrade to Rocket Lake next year if necessary. Again, I don't upgrade often, and I've been waiting for Intel to implement PCIe 4.0 to do so - and only if I need to.
Buying a 3080 with 10GB for anything but 4K video feels short-sighted to me, as many games already need more than that, so to me it's a worse choice than the 3090, even if the 3090 completely loses on value for money. But we're all different, with different needs, priorities, preferences and budgets. Couldn't you accept that what's good for you might not be best for someone else, and vice-versa, and please stop the AMD fanboy rants?

In any case, I suggest you start a new thread if you really want to keep going that way, because this brand war has nothing to do with the topic of this thread, and frankly, it's getting boring. I subscribed to this thread to follow potential driver issues, not to hear fanboys ranting about their most-hated or favourite brand, or to have to justify my purchase. If AMD gave me what I wanted, I'd most happily buy AMD. I had a Sapphire 7870 before the 1080 Ti and I was very happy with it (until they changed their drivers and made changing advanced video settings impossible, but that's another story). I also tried a few AMD GPUs for my eGPU, as they would be a better choice on my MacBook Pro under macOS, but they were all pants in Boot Camp, so I sent them back.

Anyway, thanks for considering giving us a break!
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 25th September 2020 at 16:48. |
25th September 2020, 13:31 | #1849 | Link |
Registered User
Join Date: Aug 2010
Location: Athens, Greece
Posts: 2,901
|
@Manni
I don't know if this thread is suitable for GPU discussions, but in the last few days I have clearly cut down on posting, and my posts are certainly not as huge as yours.

You are definitely an anti-AMD guy, even considering buying Rocket Lake in 2021 - a non-existent CPU not officially mentioned anywhere - and you didn't even mention Zen 3. You are an Intel fanatic and an anti-AMD fanatic. That's why you cared so much about replying to me with a huge post.

Also, a rather large part of your post is trying desperately to justify the worst buy ever - the 3090. Are you seriously telling people that you bought a 3090 for madVR and business? Are you seriously claiming that tensor cores can be used outside very specific purposes like DLSS on GeForce RTX cards? Who says such garbage?

You are definitely misleading people by justifying the huge power consumption of the 3080/3090 cards. There are custom 3090 cards with a 420W TDP. Do you know what you are talking about?

The strategies of companies like Intel, nVidia and AMD have serious impacts on our wallets and the industry in general. We have a free-market system and everyone can burn their money on garbage like the 3090 - no doubt about it. But we have to say loud and clear that the only reason to justify buying a 3090 is just this: because you have money.

Posts like the one above make me want to write even more about this subject. Posts from nVidiatel fanatics.
__________________
Win 10 x64 (19042.572) - Core i5-2400 - Radeon RX 470 (20.10.1) HEVC decoding benchmarks H.264 DXVA Benchmarks for all |
25th September 2020, 13:53 | #1850 | Link |
Registered User
Join Date: Jul 2014
Posts: 942
|
All right, I thought there was hope with you; I was wrong.
I'm out of this discussion. And yes, I work in film and as a consultant for madVR Labs (though what I post here are my own words), so the 3090 is 100% a business purchase. MS Flight Simulator 2020 is just for fun at the end of the day. EDIT: as for Intel vs AMD, when AMD starts supporting Thunderbolt on more than a couple of motherboards, I might consider it. In the meantime, for video editing, especially with After Effects, a Threadripper is a waste of money, and Intel provides better performance at the same price point. I don't expect this to change with Rocket Lake, but if it does, I'll reconsider. I use Intel because it's better for my needs, not because it's Intel. I mention Rocket Lake because I want (for my needs) PCIe 4.0 and Thunderbolt 4, and as far as I know only Intel will provide that next year.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 26th September 2020 at 11:55. |
25th September 2020, 15:02 | #1851 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Why do you assume people have to be fanatics if they disagree with you? (Projection?) I really don't think of any of these companies as entities to be fans of. The idea seems weird to me; billion-dollar corporate entities are not something I can be a fan of.
I am buying a 3090 ASAP, but for no good reason; a 3080 is a much better buy for my use cases. If big Navi turns out to be better than the 3090, I will buy one. Anyone without an AMD tattoo seems anti-AMD to you.
__________________
madVR options explained |
25th September 2020, 16:53 | #1855 | Link |
Registered User
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
|
See the description of the video:
https://www.igorslab.de/en/what-real...0-andrtx-3090/ Last edited by Klaus1189; 25th September 2020 at 16:55. |
26th September 2020, 03:52 | #1857 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
huhn, you asked why I'd even consider an FE over an AIB card. Given the latest reports out there, hopefully it's painfully obvious now. It looks like at least some of the AIBs cheaped out on capacitors, and the crash-to-desktop problems we're hearing about MAY be caused by voltage problems on those cards when they boost. nVidia used the good capacitors. ASUS did as well, and theirs was the only other card I was even looking at. And... I think I'll stick to that. LOL. The drivers are also not super awesome for these cards, combined with the problems you've found with power settings sticking. This launch feels rushed even though it shouldn't have been. We know why that is, but nVidia really didn't have any room for a screw-up here. Hopefully these issues get sorted out.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
26th September 2020, 05:47 | #1858 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
I think companies simply use release as a QA step today. In-house QA at that level would be expensive, digital updates are cheap, and everyone who bought the product can install them, so the PR hit isn't too bad. Every company that can do it this way does.
__________________
madVR options explained |
26th September 2020, 09:13 | #1860 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Quote:
There were AIB cards so terrible that they would die if you flashed a stock BIOS on them and ran FurMark - and just to make that absolutely clear, I said die, not maybe die. Clock rates are currently being considered as the reason the cards are crashing: AIB cards often reach higher boost clocks than FE cards simply by being massively cooler with the same BIOS. It is what it is, so just sit back and wait for some tests; sadly, it's not unusual for stuff like this to happen. Does someone want an exploding EVGA card even though the VRMs are totally fine? There were, and there will always be, terrible AIB cards too. |
|