AI Dreams Forum

Member's Experiments & Projects => General Project Discussion => Topic started by: frankinstien on December 17, 2021, 09:47:07 pm

Title: AMD GPUs Support GPU-Accelerated Machine Learning
Post by: frankinstien on December 17, 2021, 09:47:07 pm
Well, the $800 RTX 3080 ended up being a fraud, and eBay promptly refunded me. There appear to be lots of fake high-performance GPUs for sale. eBay actually started the return process from the seller for me; obviously a repeat offender! So I tried another seller on Amazon, where it was listed for $1,288.00; however, the package has conveniently been delayed at LAX for 5 days now after clearing customs. Some buyers on Amazon have complained that they never received the GPU from this seller. The carrier is 4PX; maybe they're in cahoots with the seller and 4PX is just posting shipping updates to fool me? In any case, if I don't get the GPU by today, Amazon will refund me. So now it comes down to affordability, and the high-end AMD GPUs are looking pretty interesting at just under $1,600. The problem is no CUDA support, but AMD and Microsoft have an alternative, which you can read about here (https://community.amd.com/t5/radeon-pro-graphics/amd-gpus-support-gpu-accelerated-machine-learning-with-release/ba-p/488595).
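That alternative is Microsoft's TensorFlow-DirectML, which runs TensorFlow on any DirectX 12 GPU, AMD included. A minimal smoke-test sketch, assuming the tensorflow-directml package (Microsoft's TF 1.15 fork) is installed via pip; the matrix sizes here are arbitrary:

Code:
# Quick check that DirectML sees the GPU and can run a large matmul.
# Assumes: pip install tensorflow-directml (Microsoft's TF 1.15 fork).
import tensorflow as tf

# DirectML exposes DX12-capable GPUs (AMD, Intel, Nvidia) as 'DML' devices.
print(tf.config.experimental.list_physical_devices())

a = tf.random.uniform([4096, 4096])
b = tf.random.uniform([4096, 4096])
c = tf.matmul(a, b)  # dispatched to the DML device when one is present

with tf.compat.v1.Session() as sess:
    print(sess.run(c).shape)  # (4096, 4096)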

I really wanted to go with Nvidia this time around; I could go with a lower-end GPU like an RTX 3070. The strong point of the AMD Radeon RX 6900 XT is its 16GB of GDDR6 RAM and a theoretical 46 TFLOPS at 16-bit precision (https://www.techpowerup.com/gpu-specs/radeon-rx-6900-xt.c3481), whereas the RTX 3080's theoretical performance at the same precision is 29.77 TFLOPS.
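For a back-of-the-envelope comparison of those quoted peaks (theoretical marketing numbers, not measured ML throughput):

Code:
# Ratio of the quoted FP16 theoretical peaks from TechPowerUp.
rx_6900_xt = 46.0   # TFLOPS, FP16, Radeon RX 6900 XT
rtx_3080   = 29.77  # TFLOPS, FP16, GeForce RTX 3080
print(f"{rx_6900_xt / rtx_3080:.2f}x")  # ~1.55x on paper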
Title: Re: AMD GPUs Support GPU-Accelerated Machine Learning
Post by: infurl on December 18, 2021, 12:29:58 am
It would be very difficult to find a good GPU card at the moment, with all the idiots scrambling for cryptocurrency and the scammers preying on them. Have you considered getting a GPU in the cloud? I've had good experiences with Linode and can recommend them if Linux is your thing. You can get a Quadro RTX 6000, which is roughly equivalent to the card you're trying to buy, for $1.50 an hour or $1,000 per month, and you can have up to four of them per node.
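A quick break-even sketch on those numbers, assuming the quoted $1.50/hour Linode rate and the ~$1,600 street price mentioned above (electricity and the rest of the workstation ignored):

Code:
# Rough break-even between renting a cloud RTX 6000 and buying a card.
card_price  = 1600.00  # USD, approximate RX 6900 XT street price
hourly_rate = 1.50     # USD/hour, quoted Linode RTX 6000 rate

hours = card_price / hourly_rate
print(f"{hours:.0f} hours")           # ~1067 hours of rental
print(f"{hours / 24:.0f} days 24/7")  # ~44 days of continuous use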

https://www.linode.com/products/gpu/

https://technical.city/en/video/Quadro-RTX-6000-vs-GeForce-RTX-3080-Ti

I expect there are lots of options to choose from elsewhere but based on my experience they are likely to be more expensive than that.

https://thechief.io/c/editorial/comparison-cloud-gpu-providers/
Title: Re: AMD GPUs Support GPU-Accelerated Machine Learning
Post by: MagnusWootton on December 18, 2021, 11:19:47 am
No CUDA support? I don't understand that.
I think the way I'm doing my GPU code is getting phased out. It doesn't entirely bother me though; I'm moving over to FPGAs from now on. The GTX 980 is going to be my last GPU.

I might just take my last old computer (the one with 2048 CUDA cores on it), whack some legs on it, and repurpose it as a robot running ancient DX11. :)
Title: Re: AMD GPUs Support GPU-Accelerated Machine Learning
Post by: infurl on December 21, 2021, 10:51:32 pm
https://spectrum.ieee.org/ibm-mainframe

Quote
Whenever I hear someone rhapsodize about how much more computer power we have now compared with what was available in the 1960s during the Apollo era, I cringe. Those comparisons usually grossly underestimate the difference.

According to these calculations, the progress made in 60 years is far greater than most people realise, especially when you factor in GPUs.

edit: Haha I really should have finished reading the article before I posted it.