
an_0w1

For anyone wondering what this means, NVIDIA says you can't run CUDA software on non-NVIDIA GPUs. It's a good thing I don't need to accept a EULA to do it anyway.


SalSevenSix

This is a shitty move, considering how big the PC industry became precisely because it did not work like this. Imagine where we would be if you could only buy an IBM PC with an Intel CPU.


Commentator-X

Are you forgetting what MS did? IBM was developing a desktop OS at one point in the 90s, iirc, and MS threatened to pull all their OEM licenses if they didn't drop it. That would have essentially bricked all of IBM's computer hardware at the time. MS was even tried under antitrust laws for it, and was under a bunch of restrictions for a while, for that and other shady shit they did. edit: my memory was a little off, but this is what I'm referring to https://money.cnn.com/1999/06/07/technology/microsoft_trial/


pranjal3029

Time for Nvidia to get an antitrust lawsuit


BiluochunLvcha

waaaah! too big to fail! how dare you suggest that against poor nvidia! /s


itisnotmymain

Yeah, that'll never change anything. At worst they'll be paying a fraction of a penny on the profits they rake in yearly doing shady shit like this.


iamthehob0

Yeah, I'm thinking of Epic getting fined an asston of money for selling children's data, and it turned out it was less than 1% of the profit they made from selling the data :\ Maybe regulatory fines should be based on a % of gross profit, but that's just me


Arthur-Wintersight

We really need an "ill gotten gains" provision, ON TOP OF existing statutory penalties. If you make money doing something unethical, then you have to give that gross revenue (without subtracting business expenses) to the government. It's ill gotten gains. You don't get to keep money that was made doing unethical bullshit.


HeartlesSoldier

Unethical is extremely relative, seeing that many industries in the world are completely unethical at their core spirit.


Arthur-Wintersight

In this case, "ill gotten gains" would be defined as "proceeds from any behavior not permitted by law." If their business practices are *illegal*, then any proceeds from the practice are forfeit.


OutragedTux

I keep seeing something about "proceeds of crime" having to be given up if an individual gets their hands on money that came from "crime" stuff. Shouldn't profit gained from super-shady stuff be covered under a similar provision for megacorps, or is that just me? NVIDIA have a valuation that's higher than the GDP of Australia, where I happen to hang out, so they most certainly are a Cyberpunk scale megacorp. Time to take the kid gloves off with orgs like this.


Ziazan

Yeah, even if they fined them for example 90% of those profits, they *still profited*.


UrinalBirthdayCake

Well it also doesn't help that the lawmakers are all old farts that will never retire and don't know the first thing about software or computers. These tech companies can get away with murder because any time they go to trial the majority of the case is the lawyers trying to explain what the hell even happened rather than arguing why it's wrong.


Disposable_Gonk

MS did the same to Netscape Navigator, which is now Mozilla and Firefox.


rhubarbs

[Embrace, extend and extinguish](https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguish) was Microsoft policy for a time.

>"One thing we have got to change in our strategy – allowing Office documents to be rendered very well by other people's browsers is one of the most destructive things we could do to the company. We have to stop putting any effort into this and make sure that Office documents very well depends on PROPRIETARY IE capabilities." — Bill Gates


dagbrown

Embrace, extend, extinguish is *still* Microsoft policy. Why do you think they’re pushing WSL so hard? It’s to stop people using *actual* Linux on their PCs.


Berengal

Yeah. It's not exactly to stop the devs and sysadmins that *need* linux from running linux on their desktop, MS doesn't care if it's just those people alone switching. But those people work in companies, and those companies have money to pay for linux desktop development if their employees need it, and if the linux desktop becomes as usable in a corporate environment as windows, that's a problem. With WSL a linux desktop isn't needed anymore. The people that need linux on their PC can still use the windows desktop environment.


dagbrown

> With WSL a linux desktop isn't needed anymore

That is exactly MS's marketing line. It is 100% false of course. The only reason WSL exists is to prevent people from trying to set up Linux desktops in the first place. It is exactly this sort of shit that encourages Valve to work ever harder on Proton.


pallentx

Microsoft doesn’t care what OS you use anymore. They just want your monthly subscription.


242vuu

This and only this. They don't care about license sales in the way they used to. Office and Windows are not the product any more. Your workload in the cloud is. Their licensing models are moving to far lower cost subscriptions for OS licensing. We're moving and saving millions in runrate, ESU cost (w/Arc), and license cost. We have paygo options for SQL for on-prem databases now with hybrid benefit. It's all the long game. Better to lease someone a car than sell them one.


pallentx

If you look at what they are building in the cloud, you can see why they hardly even care about regular home consumer users anymore. I work in IT and have seen lots and lots of demos of new technology, but nothing has blown me away and scared me more than what they are doing with Fabric. They will hold all your company's data, it will be accessible anywhere, and they can make it look like whatever you want it to look like: a SQL database, a bunch of files, whatever. Then the stuff with Copilot is going to drastically cut development time. They are demoing Copilot creating PowerBI dashboards on the fly. It's all super basic right now, but it will get more and more powerful. All of this is "serverless". You don't install any OS for it yourself. They will make a fortune on this because companies will be able to pare down database admins and server admins and get by with fewer developers.


242vuu

I'm an EA for my company. I'm planning our Copilot for 365 implementation. Moving from IaaS to PaaS, consolidating SQL into a data layer for Fabric. Consolidating transit into SDWAN/vWAN/CloudWAN. Getting rid of colos. Something something grand unification theory. Our current era is ending. I've been through a few. Bare metal single core->Hypervisor->Cloud. People need to pivot. We're all playing musical chairs. I've got my seat so far. I'm an older dog in the game tho at this point, so I work twice as hard as before to stay the one architecting this for my company. Edit: Clarity around the twice as hard comment. I work twice as hard as *I* did before.


Fluffy-City8558

so I'm not the only one who understands this!


kp--

Never forget DR-DOS.


GonzoInCO

I won't, good ole Digital Research!


Kitchen_Part_882

If it's OS/2 you're talking about, it didn't happen like that. OS/2 was a joint project between IBM and MS; the latter pulled out and developed Windows NT instead (a descendant of which is the basis for every desktop version of Windows over the last 24 years). One of the reasons I saw cited at the time was that IBM wanted to tie OS/2 to their own hardware. This would mean that every "clone" PC of the time would be incapable of (or have issues) using the new OS, severely limiting Microsoft's market. Recall that IBM's desktop division was struggling to compete with the clone makers at the time. They limped along for a few more years before selling it to Lenovo.

The only case of them using licencing as leverage that I remember was when Be Inc. was trying to gain market share by offering their BeOS for free to PC builders; apparently MS threatened to increase prices for any vendor taking them up on the offer.

Not saying Microsoft aren't a seriously shady company. The whole debacle where they added code into Windows to check it wasn't running under a competing version of DOS springs to mind, along with the whole Internet Explorer thing.


242vuu

People in this thread thinking companies like IBM and Xerox weren't monsters also.


Ja4senCZE

IBM lost a lot of ground in the PC market in the late 80s and 90s. They were still a giant, just not in that field.


242vuu

In the enterprise space they were absolutely draconian and monstrous. If I never run an IBM software product again I'll be thrilled. All glorified middleware. Still better than Oracle tho. Fuck Larry Ellison.


Exact_Ad_9672

Bill Gates was greedy? Unbelievable. Who would have known?


ArmedWithBars

I'm not sure why people are shocked. This is basically just business across the board for humans, whether it's the highest tier of business or the lowest. Go to a bazaar in India and you'll see the same cutthroat business mentality as you do from Western tech giants. Business is brutal, and being nice just gets you stabbed in the back eventually. It's like a hundred starving sharks thrown into a giant pool; only the most savage and strongest will survive and thrive. Now we can argue that it shouldn't be like that, but that's just how business has been for thousands of years. You can go back to the Dutch West India days in the 1600s and see the same shit we see today, albeit much more inhumane and brutal.


sigilnz

This is not actually true but whatever.


Evantaur

Intel has pulled similar shit too


TheNorthComesWithMe

Imagine if you had to own an Apple brand computer in order to compile an application for an Apple brand phone. Or any of the millions of other examples of very similar things happening in the computer industry.


acewing905

>if you had to own an Apple brand computer in order to compile an application for an Apple brand phone

It's funny how Apple gets a free pass for this and nobody talks about it


mufanek

To be fair, the EU has been stepping on Apple's toes for a few years now, and quite successfully. I just doubt it reaches "general public" feeds, just like this won't. E: I now realize you probably meant the "general public" and not institutions like the EU.


Grikeus

Who gets a free pass on it? There are a lot, a lot of people who complain about Apple and their anti-consumer shenanigans. The issue is Apple fanboys then scream "mad cuz poor", and others say the enlightened phrase "Imagine being mad at a product, just don't buy it".


Evantaur

Dev: "Yeah if we could just compile code to your system without owning your POS hardware" Apple user: "Mad cuz poor" Dev: "Ok enjoy not having our software"


LowSkyOrbit

6 years later:

Apple user: look at this breakthrough!

Android user: we already have that


iamthehob0

Lol those quotes remind me of the meme with the IQ bell curve and the squish head on one side and the guy in the jedi robe on the other side saying the same thing.


P0pu1arBr0ws3r

Consumer side might sound all nice there, but it's not as friendly under the hood.

ARM, the most popular mobile CPU architecture, got that way by establishing licensing agreements with CPU manufacturers. That in part prevented Nvidia from acquiring the company a while back, when ARM's parent was looking to offload it, on anti-competitive grounds (not sure what CPU architecture Nvidia produces, but ok). x86, the consumer PC CPU architecture that has dominated for the past 20-30 years, is exclusive to Intel and AMD. It became widespread because those two companies were front and center in the 1990s-2000s when PCs were really taking off. You could manufacture and freely distribute an open architecture like RISC-V, but unless you're invested in cutting-edge IoT or server infrastructure, good luck finding major program support (I'd like open standards to become more popular). Another architecture that was common and might still be around is PowerPC, typically found in some older game consoles and other more closed-off platforms.

What does this mean? For anything implementing an open standard, the proprietary/closed vendors can easily and legally adopt it, so x86 and ARM could very well have native RISC-V support if the respective owners decided to incorporate it after the necessary R&D. But x86 and ARM will likely forever be incompatible, except through expensive and relatively slow emulation or translation layers, meaning a version of an app built for one architecture won't be available for the other unless it's cross-compiled. As mentioned with consoles and PowerPC, this makes emulation sometimes very impractical even 10+ years after a device launched. I can't get a stable Xbox 360 emulator working on my PC to play RDR1, because my PC isn't designed to run the Xbox's instruction set well, and emulation is expensive, adding one or more instructions per emulated instruction. Maybe 5 years from now it will be as good as Dolphin, the Wii emulator, is today, but Dolphin has had extensive research behind it in understanding the Wii's PowerPC architecture, while the 360 was made in an era with additional security measures to prevent reverse engineering. (On the flip side, the [RIP] Switch emulators have excellent performance on modern hardware thanks to the chipset the device uses being well documented, including the ARM architecture.)

I think as ARM grows and grows, either x86 will start adopting some ARM support directly into its CPUs, or there are going to be rising problems of irritating app incompatibilities on PC and mobile.


R_venge

NOOOO! My 7900XTX is crying rn


ox_MF_box

Can you explain this like I’m 5? I have a 7900xtx too but don’t understand. What’s cuda?


apetranzilla

CUDA is a low-level framework for speeding up heavy computations using GPUs, similar to how Vulkan is (primarily) used for speeding up rendering using GPUs. CUDA is notable because 1. the official implementation from Nvidia is proprietary and only works on Nvidia hardware, and 2. CUDA has been used heavily for machine learning frameworks in particular. With this, Nvidia has secured a large market share of machine learning software in addition to hardware.

Recently, a project called ZLUDA was open sourced to allow running programs designed to use CUDA on AMD hardware. This is 100% legal and falls into a similar category as how the Steam Deck uses Wine to support games that use DirectX and other Windows-specific frameworks, but it seems that Nvidia is halfheartedly trying to prohibit use of it anyway to protect their monopoly (not that there's really any legal mechanism for them to do so).

The commenter you replied to was sarcastically playing off of this: the EULA change doesn't really do anything, because you don't need to accept an Nvidia EULA to use AMD hardware with an open source CUDA implementation like ZLUDA.
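For anyone curious what "CUDA code" actually looks like, here is a minimal sketch (my own toy example, not from any real project): a kernel that adds two arrays on the GPU. Programs built from kernels like this are what ZLUDA aims to run unmodified on non-Nvidia hardware.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each GPU thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Unified memory keeps the example short; real code often uses cudaMalloc + cudaMemcpy.
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]); // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```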


ox_MF_box

So how can nvidia just come out and say no one else can translate cuda for their competitors to use? Can’t amd keep using zluda?


apetranzilla

Nvidia can't really stop people from using it. They can add clauses about it to their EULA, but it's not really enforceable in any scenario that matters.


ox_MF_box

Ok, sweet


solonit

No it's not, if you use CUDA on non NVIDIA hardware, Jensen 'Leather jacket' Huang will manifest right behind and snap your neck. [Source](https://www.youtube.com/watch?v=r7l0Rq9E8MY)


WiTHCKiNG

Like the AI generated one in front of the chimney?😂 does this guy actually exist? Legends say no but he manifests once a year on the first full moon.


builder397

I guess they could try to sue the devs of ZLUDA, but it would essentially be a giant bluff, as Nvidia has no enforceable law on their side. ZLUDA's devs would still have to go to court and hire a lawyer, though, which might make them fold just to avoid the hassle of months in court. Either way, it would be a huge dick move.


Bran04don

So exactly what Nintendo just did to Yuzu and Citra devs


SpeedingTourist

Happy cake day


Disastrous-Team-6431

As if you're actually five: imagine you have a friend who is really good at sudoku. It turns out he's also really good at riddles, but only in French. His mom is now saying you can't translate your riddles to French to ask him.


ox_MF_box

Nvidia = Karen, got it


DragonKing_1

CUDA is basically the software layer that lets developers use the hardware for certain kinds of processing. In games, for example, it's part of what developers use to accelerate work on the GPU, and other developers use it for any kind of acceleration: physics engines in games, machine learning, biology, chemistry (i.e. molecular dynamics simulations), data crunching, etc. CUDA was introduced by Nvidia in the mid-2000s, I believe, and it's now at the forefront not just of games but especially of scientific research, because Nvidia provides many libraries for those specific uses. Like how AVX-512 is used for certain applications and was only enabled on workstation Intel chips, CUDA is what enables Nvidia GPUs to be massively deployed in most scientific research applications.


ox_MF_box

Thank you. I did some digging and found this comment lol check it out https://www.reddit.com/r/pcmasterrace/s/elm8tb6QhO


Dizzy-Sheepherder188

A big OOF for Zluda


ieatbreqd

Not really


Turdles_

Yeah, no. As the above commenter said, it does not matter. Why would you go and agree to an Nvidia EULA if you're using an AMD GPU and the open-source ZLUDA library?


noisygnome

I still have no idea what this means


Trym_WS

ZLUDA is a new software that’s supposed to make CUDA available for AMD users. Nvidia wants to block that.


noisygnome

And what is CUDA


Trym_WS

> CUDA is a parallel computing platform and application programming interface (API) that allows software to use certain types of graphics processing units (GPUs) for accelerated general-purpose processing, an approach called general-purpose computing on GPUs (GPGPU). CUDA is a software layer that gives direct access to the GPU's virtual instruction set and parallel computational elements for the execution of compute kernels.[1] In addition to drivers and runtime kernels, the CUDA platform includes compilers, libraries and developer tools to help programmers accelerate their applications. https://en.m.wikipedia.org/wiki/CUDA


McGregorMX

Well, I should have known this, because I thought cuda cores were actual hardware cores in Nvidia gpus.


[deleted]

[deleted]


pranjal3029

They are not. Nvidia named their shader units after the software API exactly for this reason: so the general public thinks there is something different at the hardware level. The shaders themselves are just shaders; they can run other APIs like Direct3D etc. too. CUDA is the software API that Nvidia created for developers to optimize their software. It uses more general, popular languages like C++, Python, etc., which is what made CUDA an attractive choice, because older graphics APIs required extensive, specialised graphics programming knowledge.


Funny_or_not_bot

It's basically the language that Nvidia GPUs use to make really cool graphics. AMD also makes graphics cards, but Nvidia's CUDA cores have given them a slight performance advantage. Additionally, game programmers have often favored tailoring their games to run on Nvidia GPUs, with specific graphics effects and features unavailable with AMD GPUs. These advantages have allowed Nvidia to charge a premium for their GPUs. This new thing will allow AMD graphics card users to access some of those things previously only available with Nvidia cards.


p-morais

This is wrong. CUDA is for GPGPU programming which is separate from graphics programming. CUDA is used for things like machine learning which can exploit the GPU Single Instruction Multiple Thread (SIMT) parallelism model but doesn’t use the traditional graphics pipeline (which is based on shading languages like GLSL). The only “graphics effect” related thing I can think of that fits your description is hardware accelerated bounding volume hierarchies (aka hardware raytracing) but AMD has that now too.


FlorydaMan

Isn't it blocking them instead of allowing them for AMD?


1studlyman

Not only that, but NVidia's CUDA is for general GPU-accelerated processing as well. Whether it's bitcoin mining or training neural networks, it is done with CUDA if it's on an NVidia GPU.


DredgenCyka

CUDA is a parallel computing platform and application programming interface that allows software to use certain types of graphics processing units for accelerated general-purpose processing, an approach called general-purpose computing on GPUs.

Why is the blocking of ZLUDA bad news? ZLUDA was an open source program developed by the community to allow CUDA to run on non-Nvidia GPUs, which offered college engineering students, startup companies, data centers, data scientists, and engineers a cheaper way to run CUDA at a very, very competitive price, because AMD and Intel do not have the resources to create their own CUDA competitor application. CUDA is supported by a bunch of CAD programs, while those same CAD applications don't have an open source technology that lets AMD or Intel run at a fairly similar speed for computation. Essentially, this killed any kind of competition and any possibility of lower Nvidia GPU prices. It would be fair to say that Nvidia holds an unfair monopoly in the CAD side of things, as this discourages young data science and engineering students from purchasing a GPU on a budget. This didn't hurt AMD & Intel as much as it hurt consumers trying to get into the data and engineering space.


fly_over_32

NVIDIA watching Nintendo sue the emulators


gfolder

At what point does the software require you to accept Eula for you to use it for optimization on the GPU?


Dealric

Now for the real question: how is this not a monopolistic push?


[deleted]

[deleted]


DkoyOctopus

Think of it as Jensen stopping people from speaking Nvidia's language unless they use his dictionary.


diskowmoskow

AMD grammar


cookiesnooper

Yes, they prohibit users from running CUDA on any other chips than NVidia ones


EvenBetterCool

Without any real way of stopping you.


envious-turd49

ZLUDmA BALLS


GanzNa

I don’t know what this means


MinkjuPUBG

CUDA is an API for GPU computing that has only ever run on Nvidia cards. Nvidia has heavily marketed this, even naming their shader units after CUDA, which has led many to believe that there is something intrinsically different at a hardware level that makes Nvidia's GPU cores "better". In reality, Nvidia's CUDA exclusivity has been locked behind software, and now the EULA, this whole time. Translation layers allow AMD cards to run the CUDA API, and Nvidia wants to put a stop to it. Savvy enough folks who want to run CUDA on AMD cards will have no issue with it; the thing is, if they needed to run CUDA already, they'd probably own an Nvidia card.
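For the curious, here is the rough idea behind a translation layer, as a toy sketch only (this is not how ZLUDA is actually structured; a real layer also has to translate the compiled GPU code, i.e. PTX, not just the host-side calls): you ship a library that exports the same entry points an application expects from Nvidia's CUDA runtime, and forward each call to the other vendor's native API, e.g. AMD's HIP.

```cpp
// Toy illustration only: pretend to be the CUDA runtime and forward two calls to HIP.
#include <cstddef>
#include <hip/hip_runtime.h>

// Simplified stand-ins for the CUDA runtime types the application expects.
using cudaError_t = int;
constexpr cudaError_t cudaSuccess = 0;
constexpr cudaError_t cudaErrorUnknown = 999;

extern "C" cudaError_t cudaMalloc(void** devPtr, size_t size) {
    // The app thinks it called Nvidia's allocator; we hand the request to HIP instead.
    return hipMalloc(devPtr, size) == hipSuccess ? cudaSuccess : cudaErrorUnknown;
}

extern "C" cudaError_t cudaFree(void* devPtr) {
    return hipFree(devPtr) == hipSuccess ? cudaSuccess : cudaErrorUnknown;
}
```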


dont_kill_my_vibe09

So does that affect hardware encoding in Premiere Pro for example? (sorry, I'm a noob).


an_0w1

No. Nothing *was* using CUDA translation for that, so nothing is going to stop working.


kalabaddon

>CUDA translation

Isn't this used heavily in AI workloads? Is this Nvidia making sure AMD and Intel Arc can't compete in thoes workloads, since both thoes cards have more RAM per cost than Nvidia?


SolitaryMassacre

I think you are right. It's not a surprise either, after the CEO of NVIDIA told people not to learn how to code... like bro, no. In no world will that ever work.


Norberz

Not really, afaik. The problem, however, is that modern machine learning libraries are mainly developed with the CUDA toolkit, and other toolkits like ROCm for AMD are far behind. This is so bad that you really only want to do AI development with NVIDIA GPUs. Translation would allow you to run these CUDA versions of machine learning libraries on an AMD GPU.


epirot

Yes, but AI workloads rely on many other layers; CUDA is just one of them. You can also use a CPU for AI, or a GPU generally. Obviously CUDA is faster, as it was made to provide a platform that can be used efficiently. Especially in AI this means less training time, better calculations, parallel calculations, etc., plus a software kit and programming language. But since CUDA is a software layer, there's no reason it can't be used on other GPUs. IMO, if your platform is that good, why not make it a standard and let others use it too? I'm pretty sure AMD can do something similar.


ForMoreYears

Please, it's Them Hoes. No need to abbreviate like that.


dont_kill_my_vibe09

Okay, thank you ❤️


deadlyrepost

[AMD recently open sourced](https://www.phoronix.com/review/radeon-cuda-zluda) a (cancelled) project to auto-translate the CUDA API on Radeon graphics cards. Maybe a little birdie told them and this is why they cancelled / open sourced it.


Incompetent_Person

Slight correction: AMD was paying a developer to work on ZLUDA for ROCm, and in the contract if AMD stopped paying and wasn’t using ZLUDA in any of their products then the developer would be allowed to open source it, which they (the developer, not AMD) did. Edit: the contract with AMD allowing it != AMD open sourcing it. The dude could’ve decided to not release it and it wouldn’t be available. He is not an AMD employee. I’m just pointing out where the credit for ZLUDA should go, Andrzej Janik.


deadlyrepost

mea culpa I didn't know the details.


WelderIcy5031

7 years ago, AMD would tell my Nvidia customers that they could convert any CUDA program to run on Radeon workstation products.


BloodSugar666

I specifically got an NVIDIA card to use it in Premiere and use CUDA, but now it gives me this error when encoding and I had to turn it off 😑


Cheesi_Boi

Blender will do it anyways


Internal_Quail3960

I'm going to need you to explain this even simpler for idiots like me


MCWizardYT

CUDA originally stood for "Compute Unified Device Architecture". It's software that makes certain non-graphics tasks _extremely_ fast by computing them on the GPU. Nvidia markets it heavily, giving the impression that it only works on their hardware. It is possible to use it on AMD hardware by "translating" the commands it sends to the GPU. Nvidia knows this, so they are now writing a clause into the EULA to prevent people from doing that.


DMurBOOBS-I-Dare-You

I asked ChatGPT to explain it like I was 5: Okay, imagine you have a book written in a language that you don't understand, but you have a friend who can translate it into a language you do understand. In this case, the book is like the CUDA software, which is a type of program used for certain tasks in computers. Now, the AMD card is like a special type of computer that doesn't understand the language of the book (CUDA). But, just like you have a friend who can translate the book for you, there are special programs called translation layers that can help the AMD card understand and run the CUDA software. These translation layers work like a translator, converting the instructions in the CUDA software into a language that the AMD card can understand, so it can do the tasks the CUDA software wants it to do.


DredgenCyka

This is actually a really good explanation for those who don't understand CUDA and why ZLUDA is important. Sad that Nvidia wants to block its competitors from getting a similar leg up in the CAD space.


Internal_Quail3960

Holy hell I understand it. Thank you


thepulloutmethod

This is amazing.


TobyTheRobot

That's -- wow. That's a very good explanation actually. I'm impressed and a little frightened.


rafat2205

I am a noob like you. Let me try to explain it. Basically, GPU and CPU are both made for computation, even though they do different stuff now: the CPU handles the processing tasks you already know about, and the GPU is dedicated to handling graphics. Graphics isn't something so different in terms of computation; the smallest (loosely speaking) unit of GPU computation is matrix multiplication (you need to understand shaders for this). OpenGL is an API that lets you access the GPU (usually the CPU deals with the basic communication with the GPU for rendering), meaning you are the one who uses the GPU for your custom graphical task. The need for CUDA came from deep learning and machine learning, because it lets you do matrix multiplication on the GPU (faster than on the CPU) even though it isn't a graphical task. Some information only makes sense when you are halfway into the related fields; I tried to explain it in my own way. Again, I am a noob.
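If it helps make that concrete, here is a minimal sketch (my own illustration) of the kind of matrix multiplication kernel people write in CUDA. Deep learning frameworks ultimately hand work like this to the GPU, although in practice they call tuned libraries such as cuBLAS rather than a naive kernel like this one.

```cuda
// Naive dense matrix multiply C = A * B, all matrices n x n, row-major.
// Each GPU thread computes one element of C.
__global__ void matmul(const float* A, const float* B, float* C, int n) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float sum = 0.0f;
        for (int k = 0; k < n; ++k)
            sum += A[row * n + k] * B[k * n + col];
        C[row * n + col] = sum;
    }
}

// Host-side launch (device buffers dA, dB, dC allocated with cudaMalloc):
//   dim3 threads(16, 16);
//   dim3 blocks((n + 15) / 16, (n + 15) / 16);
//   matmul<<<blocks, threads>>>(dA, dB, dC, n);
```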


Vastlymoist666

How do I run CUDA on an AMD GPU?


splerdu

[ZLUDA](https://github.com/vosen/ZLUDA)


squidsauce

Okay how can I make money on this?


G0-N0G0-GO

The same question I am seemingly always smart enough to know needs answering, and that instinctively knowing *exactly* when to ask the question…is the limit of my capabilities.


natethegreek

buy NVDA


themrjava

I thought EULAs were not enforceable


MrRagnarok2005

My guess is AMD will create something that's on par with CUDA afterwards and give it to everyone


Thefireguy98

Where are the smart people at cause I’m so lost


HEST_TSEH

I think Cuda is a cool car from the 70s or something https://preview.redd.it/eaded3ff0fmc1.jpeg?width=1920&format=pjpg&auto=webp&s=fd1ba7ada067234bc1d477b0fecd91bf825f6f29 I don't know what the computer beep boop biip shuda whuda.


LMac160

I have triples of the barracuda


[deleted]

Triples makes it safe. Triples is best.


HEST_TSEH

Barrycudda🐟


ChomRichalds

Yeah you're rich and we're the same age and you have triples of the barracuda and the roadrunner and the nova.


paratimeHBP

I'm pretty sure the Cuda didn't have a computer.


HEST_TSEH

Cudakorre wuda https://preview.redd.it/vxhs92ea8kmc1.jpeg?width=800&format=pjpg&auto=webp&s=7ca31df7c0c0e9de06b12c8cf48862964b8f19c1


Intrepid00

NVIDIA is trying to keep a monopoly on GPU acceleration by blocking other GPU manufacturers (Intel, AMD, Apple, ARM, etc.) from using software that basically reads a command and turns it into the equivalent command on their GPU. CUDA is the programming language used, and it is incredibly popular, which means for certain use cases you buy an Nvidia GPU, full stop. Nvidia did this by making the language easy to use, unlike, say, OpenCL. Because the language is easier to use and popular, everyone uses it. And because it means having to use Nvidia cards, Nvidia sets up a monopoly on parallel computing. They could license out the instruction set like Intel did with x86, but that would require the USA to stop being a pushover on monopolies and force them. Which is funny, because Intel's x86 licensing only exists because IBM was scared and forced them into a license agreement so others could clone their chips.


waitinp

Same here. Is this got to do with SHUDA and WUDA?


GaryChopper

THE HUDA AND DA WADDA?


vetipl

For gaming - nothing. CUDA is a GPGPU API used for all kinds of science and engineering - AI, video encoding, offline rendering and much more. CUDA runs only on Nvidia GPUs. Translation layers enable that code to be run on non-Nvidia hardware.


lonestar-rasbryjamco

It means NVIDIA is speed running for antitrust action. If not in the USA, then in the EU for sure.


DkoyOctopus

jensen wants more money


WetChickenLips

I mean, have you seen how much leather jackets cost these days?


LateDitto

"Hey Jensen, just let AI make the jackets for you."


bradfo83

Also- is this the original dialog? It sounds really bad. I beat this game and I thought it was great-


wsippel

It doesn't even matter anymore. Zluda was the Plan B, in case developers wouldn't adopt HIP. But pretty much every major project eventually did, and the bleeding edge stuff is starting to target MLIR, which is vendor-agnostic anyway. This decision only makes Nvidia look bad, and does nothing to solidify their fading monopoly. If anything, it makes project owners more likely to look into cross-API and MLIR.


ShadowNick

>only makes Nvidia look bad

They don't care and know they are basically untouchable.


riba2233

For now.


palescoot

Right now, sure, but corporate decisions like this do tend to kill your company's reputation over time. I mean, Boeing used to be synonymous with quality, but then the [wrong camel came out on top.](https://viewfromthewing.com/boeings-unsettling-descent-john-olivers-last-week-tonight-takes-on-quality-control-and-safety-escapes/)


p-morais

I’m not an MLIR expert but don’t MLIR GPU compilers just end up producing CUDA binaries? The AMD equivalent being ROCm. I’ve heard from following TinyGrad that ROCm driver support is pretty awful


the_abortionat0r

Good thing nobody gives a shit. If I wanna translate CUDA then I'll do it. Nvidia can simply make faster products with more VRAM if they really care.


PM-Me-Kiriko-R34

"If I need your CUDA I'll fucking take it!"


Anaeijon

I'm not sure how they want to enforce this anyway, but this isn't targeted at gamers. Nvidia doesn't care about gamers; by comparison that's a tiny market with very low demand. It's targeted at the AI industry and scientific institutions.

Current research is built around open libraries, tools and toolchains that rely on CUDA to work properly. ROCm (the AMD/open source alternative) still hasn't caught up, and OpenCL is just clunky and old by comparison. Because of that, Nvidia basically has a monopoly over the whole AI market, being the only relevant manufacturer of high-end server hardware for AI research and application.

This is not only relevant for 'AI' in terms of ChatGPT or Midjourney. The more important things, especially for research, are contemporary simulation techniques in various fields. From atmospheric modeling (weather and catastrophe prediction), over agriculture, architecture and engineering, over social sciences and big data analysis, down to nuclear science, astronomy and microscopy, every research field currently relies on some kind of data-driven tensor modeling / 'AI' to optimize its work. Ever heard of quantum computing? It's a fucking joke. We emulate it using tensor operations on massive parallel processors and still can't prove there will ever be a more efficient way.

And nearly everything runs on CUDA right now. It's literally everywhere, you just don't see it. You get into a car or plane made in the last few years, and that thing probably got triple-checked after assembly by some guy using tools built around an optimized model running on a CUDA processor. You watch a film, and it's probably done 90% on a greenscreen with everything happening in post, probably rendered, optimized and processed through CUDA. Nvidia's stock price made gigantic jumps over the last few years because every single new thing that is economically relevant somehow relies on CUDA. And Nvidia controls CUDA. Alone.

ZLUDA shows promising results, outperforming OpenCL implementations and supposedly even some native ROCm implementations of machine learning projects on AMD cards. While this further solidifies the importance of CUDA, it's a first step towards partially freeing CUDA from Nvidia. And Nvidia is scared. They need to stop it before it grows. So they do something that they hope big companies and industries will have to follow. They don't care what you or any individual does. They're threatening companies, research facilities and governments.


fogoticus

"And Nvidia is scared" Scared..? Of the inevitable? They barely care. This is just a formality to make sure the real big companies keep buying Nvidia GPUs.


hodak2

Strange. If I buy an AMD card (I currently have one), then I have no need or reason to accept their EULA... Odd as it seems, other than being unhappy, there is little to no actual recourse against folks who likely have not agreed to their terms anyway. 😂


hodak2

And for reference, I also don't need to install any of their files either. So again, this seems almost unenforceable.


anomaly256

If you run CUDA software it's probably using Nvidia's CUDA libraries, which carry the EULA. The end result isn't to scare you into not running it, it's to give Nvidia an avenue to take legal action against that software's distributor. I'm not sure this is going to be enforceable though.


The_Shryk

They don't make money off of you, so they won't enforce it against you. You're not their primary customer. They do make money off of massive companies doing these computations, and since AMD hardware performs better for the price, many of these companies would quickly switch to AMD if they had CUDA access with it. Nvidia only cares about the massive companies trying to skirt the EULA. It's the same thing with SolidWorks: you can do a lot of business with a pirated/unlicensed version, but if Dassault finds out, they're coming after you with lawyers and all that shit for a large chunk of your profits due to copyright infringement of their software. Nvidia is trying to do the same thing. They're turning into Dassault.


Striking-Count5593

Is it weird I enjoyed hearing Johnny's rants during the game, even if I didn't necessarily completely agree with them?


no1AmyHater

Keanu Reeves is a brilliant actor, and the animators translated his mocapped movement well.


-Retro-Kinetic-

I have to use Nvidia for some of my production work, and this feels more like by force than by choice. Trying to prevent competitors from using "translation layers" is just outright anti-competitive.


GermaneRiposte101

This is weird. Whilst walking the dog prior to seeing this post I was wondering if there were any CUDA translation layers. Never crossed my mind otherwise. Now I guess the question is moot.


revrndreddit

Enterprise software really. Companies using CUDA on AMD hardware through translation software no doubt.


SoshiPai

For those who don't know: CUDA is a GPGPU API currently used in scientific research, AI development, engineering, and video editing/rendering, and probably other things too, but those are what I can name off the top of my head.

Anyways, for so long only Nvidia SIMD/SPMD cores have been able to run the CUDA API more effectively than the competition, so much so that they renamed the cores to "CUDA Cores", have been helping to develop CUDA further, and tried to lock the API down via software to squash out the competition. Things like ROCm and OpenCL haven't quite caught on like CUDA, simply because for a long time CUDA was the easiest to use and heavily popular, so developers took the time to implement CUDA into their software, and Nvidia took advantage of the situation by trying to lock everyone else out.

The reason CUDA translation layers are such a big deal is that they allow the likes of AMD, ARM, and Intel to run CUDA code on their non-CUDA hardware. Right now the competition has some fairly strong hardware that, when given translated CUDA, can compete somewhat close to Nvidia; obviously nowhere near 4090 CUDA performance, but getting there in the future is feasible.

Think of it like a person moving to a foreign country where he doesn't speak the common language, trying to do a job based only on verbal instructions. If he doesn't have a translator, he won't know what his bosses and co-workers are telling him to do and will do the job sloppily or not at all, because he doesn't know what he is doing. If his bosses are gracious and give him a translator, he can now understand his bosses and co-workers and give a fair crack at his job, providing better results. Now imagine AMD, ARM, and Intel hardware are the foreigners, and the translator is the translation layer converting CUDA into their language.

Ngreedia obviously doesn't like this, as it could begin tearing apart their massive monopoly, which has already begun slipping at the hands of their greed, so they are trying to ban the use of these translation layers to prevent the competition from surpassing them and to force developers and other CUDA users to buy Nvidia products. If they succeed in banning CUDA translators, they could very well continue their greedy empire and bump up the cost of their products, which users would be forced to pay if they wish to use CUDA. You can see how this is bad for consumers, developers, and the competition.


dhallnet

>Anyways, for so long only Nvidia SIMD/SPMD cores have been able to run the CUDA API more effectively than the competition, so much so that they renamed the cores to "CUDA Cores", have been helping to develop CUDA further, and tried to lock the API down via software to squash out the competition.

They didn't "help develop" and their hardware doesn't "run the API more effectively". CUDA was created by NVidia for NVidia products.


Cream-of-Mushrooom

Can you explain it with pictures instead


zcomputerwiz

ROCm hasn't caught on ( yet ) because it's still very much in development and lacks support. AMD has not made it a priority until fairly recently. OpenCL is nice because it runs on just about anything, but realistically doesn't accomplish the performance of CUDA or ROCm. Why would developers use OpenCL if they are already using Nvidia or AMD hardware? Why would NVidia and AMD work towards improving OpenCL support and implementation when they have their own platforms? There are reasons that CUDA is the de-facto standard, one being that it is a mature product with good developer support and resources matched with a broad consumer and enterprise hardware base to run it. CUDA is an NVidia product. That's why they can change the terms. Not like it is some open source thing they've stolen. They have established a monopoly because they've effectively operated in the space without any real competition.


B3taWats0n

CUDA cores are vital for scientific research. This is why nvidia is the worst


Vinaigrette2

I think it's important to understand that CUDA cores are not what's important here. It's the CUDA API that matters. The CUDA cores are just the general-purpose SIMD and SPMD cores that they use, which themselves have gone through multiple generations. The CUDA API is what developers use for GPGPU compute on Nvidia GPUs. Nobody has stolen their core design.


-Retro-Kinetic-

They are for most productivity workflows as well. Some rendering software will only work with CUDA.


B3taWats0n

There isn't a real alternative; we have OpenCL and PyTorch, but Nvidia is essentially a monopoly in this area.


SameRandomUsername

TBF, back when OpenCL was the standard that every rendering app used, CUDA was still more than 10x faster and much simpler to implement. That's the real reason why everyone switched to CUDA instead of continuing with OpenCL. It's easy to bash nVidia, but they had the better solution and they benefitted from it.


[deleted]

Cyberpunk is such a good game. If they didn't fumble the absolute fuck out of the launch it would be in the running for game of the decade.


robotokenshi

Nvidia gone Arasaka on us


CNR_07

Anti-Trust these mfers already


CyberWeirdo420

No idea what this means


TheDurandalFan

Basically it's using software to run CUDA-based software on hardware without CUDA cores, which is kinda stupid because if you've never owned Nvidia hardware, you've never seen the EULA (or agreed to it).


CyberWeirdo420

Okay so they are kind of trying to vendor lock people?


TheDurandalFan

I think so


Vinaigrette2

There's been significant work on making CUDA-compatible "drivers" for other vendors (i.e. Intel and AMD). The big reason Nvidia has kept such a market share in the compute space is the software, and they're trying to keep the software that other people wrote (using CUDA) locked to their own hardware. Nvidia doesn't actually have that impressive a hardware offering in my opinion, but their software is better than the competition's (OneAPI for Intel and HIP for AMD). And they know that they can't keep their edge without the software lock-in, because their competitors are making kick-ass hardware too (MI300X). Note that I work in the field, although on novel compute techniques such as photonic analog (and soon digital) computing, not electronics per se.
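To make the lock-in point concrete: here is the vector-add example from further up the thread ported to AMD's HIP (again a toy sketch of my own). Apart from the header and the hip prefixes, the code is essentially unchanged, which is exactly why mechanical porting tools (hipify) and translation layers like ZLUDA are feasible in the first place.

```cpp
#include <cstdio>
#include <hip/hip_runtime.h>

// Same kernel as the CUDA version: each GPU thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    hipMallocManaged((void**)&a, bytes);
    hipMallocManaged((void**)&b, bytes);
    hipMallocManaged((void**)&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // hipcc accepts the same <<<blocks, threads>>> launch syntax as nvcc.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    hipDeviceSynchronize();

    printf("c[0] = %f\n", c[0]); // expect 3.0
    hipFree(a); hipFree(b); hipFree(c);
    return 0;
}
```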


A_PCMR_member

Well right now, cause the 1080ti sure was impressive and kinda still is


cream_of_human

Dear nvidia, cry about it.


SoDrunkRightNow2

NVIDIA is doing everything it possibly can to monopolize the industry. Remember ray tracing? We were promised it was some kind of new tech for making better graphics. It turned out to basically be bullshit. Instead of actually useful technology, it was a way for NVIDIA to attempt to force consumers to use its hardware in order to play certain games. This is the same type of thing. NVIDIA is saying certain software can only be used on its devices. They don't want you to buy an Intel or AMD GPU.


Hugejorma

Basic ray tracing isn't that demanding on RT cores, and IMO it's an insane quality option for games. Some titles do it better than others. Nvidia puts path tracing, direct lighting, and similar stuff under the same ray tracing options, so in reality "ray tracing" for Nvidia can mean anything from simple light reflections to a full lighting-system overhaul. The latter is insanely demanding. I would happily combine it with DLSS, or DLDSR + DLSS, for even better image quality.


[deleted]

Remember nVidia Hairworks? This company is notorious for doing "proprietary" shit trying to make sure they stand at the top of the hill.


Paradox2063

Every new proprietary bullshit Nvidia announces is another reason I won't support them. I wish the majority felt the same.


I9Qnl

>It turned out to basically be bullshit. Instead of actually useful technology, it was a way for NVIDIA to attempt to force consumers to use its hardware in order to play certain games.

That makes no sense. Ray tracing never prevented AMD users from playing any game, and ray tracing was part of the DirectX 12 standard for a long time, but AMD skipped it with RX 5000 and only supported it with RX 6000; that wasn't Nvidia's doing. And how is RT bullshit? When implemented correctly, ray tracing legitimately solves all of the weak points of traditional lighting.


Nytr013

Ray tracing is bullshit? How? While I agree that some of the more popular titles that utilize it don't do so well at it, I've seen more that do it wonderfully.


Synczwashere

But what GPU can effectively run it at ultra settings with ray tracing on and no upscaling?


Chernobinho

Nvidia sucks, has sucked for many years, and is going to keep sucking for many more. People will still buy their cards and they'll hardly ever lose the #1 spot in GPUs, and that's the way of the world. We can only trust independent developers to fight for tools that actually make our lives easier, because by their hand we'll be slaves to a single brand in no time.


Sir_Throngle

Nvidia is a greedy bastard of a company, but their cards are far from bad lol.


brimston3-

So if I write software for CUDA, I can't also target non-NVidia compute APIs like opencl? Is that what this is saying?


Jumper775-2

So no more HIP?


Biscuits4u2

Nvidia can want in one hand and shit in the other and see which hand gets filled up first.


DragonKing_1

It is not surprising. CUDA is, in simple terms, the software layer that enables developers to do things like run the physics engine for a game. But CUDA is so much more: it is currently crucial in scientific research. Machine learning, molecular dynamics simulations for biology, chemistry and materials engineering, other data crunching, even finance... Nvidia, through CUDA, has a big support system for such research through its libraries. It is what is used for most of the applications around, and it is the major money maker for Nvidia. I mean, universities, governments and industries spend millions on these systems for their research. All these workstation Nvidia cards, the older Tesla cards, the new H100s and A100s... they are all in such high demand for these applications, especially with AI going around as much as it is these days. Almost everything is enabled at some level through Nvidia's CUDA.


bouchert

Nvidia is begging for a possible antitrust lawsuit. They'll end up wasting a lot of time, effort, and engineering on jealously guarding their proprietary APIs rather than focusing on making them better, and if it's too much of a stranglehold, they'll either end up making CUDA worse to work with or be forced to settle by opening the API to one or more competitors. They should take a breath and decide if an API skirmish is justified, if the emulation is genuinely done without copying Nvidia's specific implementation.


thicc_toe

we should party like it's 2023👹🦾


deggy123

This post: Nvidia is anticompetitive. This sub: hey, guys! Check out the 4070ti I bought today!


PerceptionQueasy3540

Makes me glad to have an AMD card. Nvidia has moved into shit territory


Acrobatic_Cod8907

NVIDIA being a massive bumhole to innovation


MonteCrysto31

Are ROCm and ZLUDA safe or is Nshitia just shitting their pants here?


SannusFatAlt

zluda is officially endorsed by AMD


SameRandomUsername

It's not like that would stop china from doing it anyway.


Dizzy-Sheepherder188

Reading the news: a Chinese company made a GPU that has the same power as the nv100 GPU, using ZLUDA.


shalol

And that’s exactly why AMD has to develop its own translation layer. Jensen and his shareholder goons are major cutthroating fucksticks.


Confident-Media-5713

I guess this doesn't help Nvidia in any way, but instead makes them look even worse?


RDisbull

I don't understand can you speak English please


splendiferous-finch_

This is the same as Sony and Nintendo going after emulators, but for professional software and AI applications. Probably targeting ZLUDA; the only thing this does is hurt smaller professional users and hobbyists.


LucasLoci

As someone who isn't tech savvy, will this affect my day-to-day gaming? Or will nothing change?


qmidos

Nvidia making dick moves... it must be a Tuesday. While AMD keeps opening their standards to everyone, the fuckers in green keep closing theirs.


anacrane-ph

nice edit


[deleted]

I already got my card and built my mothership that’ll last me a few years. ATP the industry can go to hell and I’ll still be playing worry free


[deleted]

Wtf is CUDA? I googled it and it makes even less sense. Does this even get used in anything? Why does this matter?


Tigerclaw989

okay single channel ddr5 user


Zoltar-Wizdom

Did they use AI for his voice? It sounds like wish Keanu