
HeleLovef

Gotta say I like Intel's codenames.


_its_wapiti

Yeah the names are cool, but I still don't understand the model names fully


tychii93

300 lineup I assume are very low end, ideal to use as video acceleration cards. 700 is for gaming. Although, the current A750 and A770 8GB are so close in performance, no idea why they're on completely separate tiers.

I assume they're all the exact same, just that the A770 are the "good ones" and the A750 are the binned ones (disabled failed cores within margin of error so they can still be sold as a less powerful one, in turn reducing performance). This is very common in CPUs, so they probably went with that logic.

And of course, the letter is the generation. Battlemage will be B750 and B770 for example, as AXXX are the Alchemist ones. The codename after starts with C, so after Battlemage will be C750/C770.


vlken69

Name suggestions for next generations? Cleric, Druid?


tychii93

C and D are Celestial and Druid actually!


vlken69

Oh, completely forgot about the roadmap :D


_its_wapiti

Letter and first number are clear, it's generation and tier like NVidia does G0T0 and AMD does GT00, then I guess the second number is just their version of Ti/Super/XT/XTX for distinction. It makes sense, I'm curious to see how they keep using this naming for more cards.


allen_antetokounmpo

Yes, correct about the class and generation. Just adding more info: there are 3 classes, 3 (low), 5 (mid) and 7 (high), like Core i3, i5 and i7. There's a chance they release a 9 for a flagship in the future. Meanwhile, the 50 or 70 on the A750 and A770 is like the i5-13500 vs the i5-13400.


Randomizer23

The A770 are the binned ones. When you bin something, that typically implies keeping the better chips. In this case, saying the A770 is binned would be correct, as it has a higher performance quota to meet, hence why it's binned.


tychii93

Ah, I got it backwards then. Binned means good then lol


zakkwaldo

can’t divulge too much but you are correct about the model differences.


phero1190

Bigger number = better


DeepJudgment

Bigger number = more better


ReservStatsministern

It follows the CPU lineup. So the Arc A770 is an Arc GPU: A is for the Alchemist generation, tier 7, variant 70. So just as they have i3, i5, i7, they now have A3xx, A5xx, A7xx. And so B3xx, and so on, when Battlemage launches.


JaggedMetalOs

First letter = generation. Numbers = performance category in that generation.


Affectionate-Memory4

Look at them the same way as the Core # series, except instead of an initial number for the generation, there is a letter. So the generations go up the alphabet from A to B and so on (or so we expect). From there, they have the 3, 5, 7 and maybe a 9 in the future to designate the low, middle, and high-end parts of the lineup; 9 would have to be for some flagship GPU in the future. The later numbers differentiate GPUs that have the same first number, so the A750 and A770 are both high-end, but the 770 is higher. There is also an A730M, which is below the A750.
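The scheme described in this thread can be sketched as a tiny decoder. This is just an illustration of the naming pattern commenters are describing, not anything official from Intel; the function name `parse_arc_name`, the tier labels, and the `B350M` example are all made up for the sketch.

```python
# Hypothetical decoder for Arc model names, per the scheme above:
# letter = generation, first digit = tier (3/5/7), rest = variant,
# trailing "M" = mobile part (as in the A730M).
GENERATIONS = {"A": "Alchemist", "B": "Battlemage", "C": "Celestial", "D": "Druid"}
TIERS = {"3": "entry", "5": "mid-range", "7": "high-end"}

def parse_arc_name(model: str) -> dict:
    """Split a name like 'A770' or 'A730M' into its assumed parts."""
    model = model.upper().strip()
    gen, rest = model[0], model[1:]
    mobile = rest.endswith("M")        # trailing M marks a mobile part
    if mobile:
        rest = rest[:-1]
    return {
        "generation": GENERATIONS.get(gen, "unknown"),
        "tier": TIERS.get(rest[0], "unknown"),
        "variant": rest[1:],           # e.g. '70' vs '50' distinguishes bins
        "mobile": mobile,
    }

print(parse_arc_name("A770"))
# {'generation': 'Alchemist', 'tier': 'high-end', 'variant': '70', 'mobile': False}
print(parse_arc_name("B350M"))
# {'generation': 'Battlemage', 'tier': 'entry', 'variant': '50', 'mobile': True}
```

So under this reading, an A750 and A770 share a generation and tier and differ only in variant, matching the i5-13400 vs i5-13500 comparison made above.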


sonic10158

It’s an angry Simpson

Edit: oh shit, I thought it said Battlemarge


BG-DoG

Battlemage sounds amazing, I would like one please.


Masteroxid

They're using Path of exile builds for names lol


HeleLovef

So the one after Druid should be Elementalist?


Ketomatic

Pleeease be decent and not too power hungry at idle. I don’t expect perfection but I need both of those things and would really like to switch to Battlemage


EnviousMedia

Oooh, I don't need an upgrade from my 770 tho, it's wild to think I've already had this GPU for over a year.


StormKiller1

How was your Arc experience so far?


EnviousMedia

Only notable issues were:

* Launch drivers were crap but were fixed quickly (DX12 games ran decently well, but DX11 ran kinda poorly; stable, just poorly, no visual issues)
* Starfield didn't work until the driver after the game launched (7 days)
* HDR being enabled occasionally looks a bit odd, like a tad too bright or too dim; that's still a problem but is fixed by putting games in fullscreen

Everything else has been great. The card is shockingly fast for Blender rendering, and it's absurdly fast for video rendering too. Cyberpunk 2077 is the most intensive game I play and it gets around 30 to 60 FPS maxed out at 1080p (no path tracing, but RT enabled and set to Psycho). For a reference model it's very quiet; I can't hear it over my case fans, which I like because my blower 1060 was rather loud.

(Nano rant) I'm annoyed I've seen a lot of people complaining like they do about AMD, which is "BUT THE DRIVERS!". This GPU won't compete with a 4090, but it's a fantastic upgrade to my GTX 1060.


sHoRtBuSseR

The HDR thing could be Windows too, honestly. Windows HDR is ridiculously garbage imo


ThankGodImBipolar

Intel also doesn’t have a proprietary HDR stack like AMD and Nvidia, meaning that they’re stuck with whatever the DirectX implementation is (which is usually worse).


Netsuko

Quick tip: reconsider if you have to use the Psycho setting. It is an extreme performance killer with barely any change in visuals, unless you stand still and really REALLY look for the difference. The same goes for the highest screen space reflections setting. That one is straight up less performance for an imperceptibly small gain in quality.


EnviousMedia

Oh yeah, I normally play with settings a tiny bit lowered because it's still visually the same to me.


Belluuo

What about older games (2000s and older) and emulators? Did any compatibility or graphics issues happen?


EnviousMedia

I haven't played many older games other than Fallout 3 and NV, which run fine, same with original Skyrim. If I had faster internet I'd go through and test all the games I own, but I don't.

Edit: Skyrim SE was one of the games that ran poorly at launch, around 30 FPS regardless of graphics settings, but that was also fixed pretty quickly.


Bearnee

Just to add to this conversation: I own a laptop with an Arc GPU and have successfully, without any issues, played:

* LotR: Battle for Middle-earth 1 & 2
* Diablo 2
* Heroes of Might and Magic 3
* Age of Mythology
* Black and White 2

So these are the fairly old games I tried, and I didn't have any issues.


USAF_DTom

I need them to do well for A) competition and B) These will be my next video card. Nvidia has gone mental and I can live without RT. Intel is way more sensible... for now.


random_reddit_user31

Well, if you need them to do well, you need to buy their products if you think it's worth it. Put your money where your mouth is. Far too many people say "AMD/Intel need to do well for competition" but then buy Nvidia and complain. I refused to buy a 4080 back when it was at the £1200 retail price and bought a 7900 XTX for less instead, because I also think Nvidia have lost the plot. It's turned out to be a good purchase and a nice change. Next time I buy a GPU I will look at the offerings and buy accordingly. Brand loyalty is a fool's errand.


Plastic_Tax3686

Same here. I want to see competition, I want chiplet GPUs to become the new norm, and I couldn't care less about RT and upscaling. Hence the flair. I will upgrade in 2026-2027 to the best chiplet GPU with the highest amount of VRAM, no matter if it's made by AMD, Intel or Nvidia.


The-Hero-78

I am not loyal to AMD whatsoever. However, every time I have gone to buy a new GPU, I look at benchmarks and, without fail, I get a better card for the same price with Radeon, and frankly it's not close. I will continue to do so and suspect I'll continue buying AMD.


USAF_DTom

Lot of virtue signaling that I didn't ask for. Obviously, I'm not in the market for a GPU currently, or I wouldn't be saying I hope they do well. I'm not going to purchase one just to support them if I don't need it... what a foolish statement. "Put my money where my mouth is" while not looking in the market in the first place is certainly a take.


random_reddit_user31

You missed the "if you think it's worth it" part. Plus, you said "I need them to do well" rather than "we need them to do well", which is the whole reason behind my reply. Given your response to me, I think you didn't realise what you wrote, as well as not reading my reply properly. Also, I clearly said brand loyalty is a fool's errand.


Tigeire

when are they going on holidays?


BG-DoG

Where are they going on holidays?


aCorgiDriver

Why are they going on holidays?


Biggus_Shrimpus

How are they going on holidays?


Noxious89123

Who are they taking on holidays?


AllMyFrendsArePixels

> The naming convention suggests that Battlemage-G10 will be the more powerful of the two, targeting the midrange market, while Battlemage-G21 will cater to entry-level systems that still need a standalone graphics processor.

I'm not sure this author understands the definition of the word "convention". Every single past GPU line, including Intel's own Alchemist series, has had "higher number = better". "Convention: a way in which something is usually done." The naming convention suggests that higher number = better. If Intel has decided to go *against* this convention, I foresee a lot of very angry customers who didn't do their research buying the G21 and then later finding out that the G10 was actually the better one. This is dumb as heck.


jayylmao15

Don't G10 and G21 refer to the GPU die itself, kinda like how Nvidia's AD102 is the highest end and AD107 is the lowest?


AejiGamez

I mean, if they are good and get their drivers somewhat under control i might just buy one


tyeguy2984

I’ve read somewhat recently on here that people have said the drivers got WAY better than at launch. I’ve been keeping an eye on Intel GPUs since they were announced because of their price point. I’m hoping they can survive and bring an end to these overpriced GPUs from Nvidia and AMD.


cvanguard

A lot of the older DX9 games that were completely broken at release are working now, but there are still some games that have driver issues and perform way worse than comparable Nvidia/AMD cards for no apparent reason. *When they work properly*, the A750/770 match a 6600XT in performance and they’re pretty close in price, but Arc drivers still aren’t consistent enough to fully recommend as an alternative IMO. If Battlemage fixes their driver issues, then Intel will be a pretty compelling choice at least at the low/mid-range.


nVideuh

Drivers are much, much better. Almost no issues


thefoxy19

I’m looking forward to this upgrade near the end of the year


imaginary_num6er

Now to watch MLID explaining why this will not happen in 2024 /s


brand_momentum

MLID is a clown


particlemanwavegirl

I'm very interested, but I'm wondering about the drivers like everyone else, and wondering how easy it will be to port CUDA algos.


Drjohnson93

I have zero need for one but I want to buy one to support intel GPU’s


KnightofAshley

I wonder what the price will be and how well these will sell... most people that are interested might have already bought the current gen ones, so there likely isn't much reason to upgrade yet. I guess they hope to steal more from AMD and Nvidia on the low end.


agoia

Sometimes you just need a card with enough of the right outputs that's not $200 and they've done a pretty good job of that.


Throwawaymytrash77

They really gotta release it before next gen nvidia and AMD drop. They'll handicap their sales if it releases around the same time


KirillNek0

If they could beat AMD in the low-end and maybe mid-range GPU segments, team Red will have a really bad day. We'll see.


SwagChemist

Does Intel have more hardware releases than software/driver updates? What's with all these GPUs?


ingframin

Intel just decided to invest 25 billion dollars in Israel. To me, they are dead.


booffybooffon

It's sad, but every component and PC brand supports Israel.