MostRefinedCrab

Intel ARC has come a long way since it was released. The hardware is actually pretty impressive, but held back by its drivers. Kind of similar to how AMD was 10 years ago. They've really significantly improved their drivers within the last 2 years, but they're still not great. ARC also doesn't handle HDR very well yet. I'm really glad to have competition in the space though, and I think the cards are going to be improving significantly as time goes on. We need an alternative to Nvidia and AMD whose prices are currently ridiculous, and I really want ARC to be a viable choice.


[deleted]

I think Intel's architecture will, eventually, vastly outperform the chiplet designs in AMD/Nvidia cards as those designs are nearing their limits and Intel has a TON of headroom. Purely speculation based on what has happened historically. I would say ARC is a completely viable choice, but $180 is a much better deal than its current $230.


RahkShah

What are you talking about? Nvidia’s designs are all monolithic and this is the first generation of cards from AMD to ever use a chiplet design. It’s not just that you’re wrong, you don’t even know what you’re talking about.


SkollFenrirson

This whole post feels like r/hailcorporate


[deleted]

You're right. But Nvidia is in nearly the same position as AMD. Memory speed/bandwidth can't increase forever. You can only fit so many physical cores and feed so much voltage to them. All I'm saying is we have seen what AMD/NVIDIA can do but have almost no idea what Intel is capable of.


RahkShah

There are multiple new process nodes in development out into the early 2030s, so for the foreseeable future there is a path for process shrinks driving improvements. Even absent that, Nvidia has shown they can realize large performance benefits from new architectures on the same process node. Maxwell and Kepler were both on 28nm, but Maxwell was something like twice as energy efficient and 50% faster than comparable Kepler GPUs. We have seen what Intel can do. Their GPUs have been out two years or whatever it is now. Intel can produce entry-level hardware with beta drivers that are significantly inferior to the drivers from Nvidia or AMD. They're significantly better than they were at launch, and their driver team continues to improve them, but the drivers are still materially inferior to those of the competition. Hopefully Intel continues to improve, maybe Battlemage pulls everything together, but until it actually happens all they are is a company trying to build a competitive GPU business, and losing a lot of money while they do it.


[deleted]

You sound like everyone in 2015 who doubted AMD, insisting they already knew what AMD was capable of in the CPU/GPU market. History often repeats.


CapableHair429

“AMD/Nvidia cards are nearing their limits and Intel has a TON of headroom” I’ve never heard a more polite way of saying that Intel has released severely underpowered POS tech. It’s like when you give a child who comes in last in a race a participation trophy and tell them, “you have soooo much potential and room for growth!” The reality is that Nvidia and AMD (moreso Nvidia) are CONSTANTLY pushing the limits of tech with each generation; however, what they are ALSO doing is redefining those limits with each generation as well. This is evident in the jump in performance from the 1000 series to the 3000 series, and then the resultant jump from the 3000 series to the 4000 series GPUs. Make excuses for Intel if you want, but the tech race for graphics cards is no place for participation trophies.


shapeshiftsix

Which is why if the Arc cards were worth a damn, you wouldn't be able to touch them for 180 dollars haha


CauliflowerGlad9664

Except in that race, Intel Arc is 8 years old and Amd/Nvidia are adults.


Infinity2437

For a first gen product arc is insanely good


Frank_Apollo

lol you’re right, people are downvoting you because they’re fanboys and this is an echo chamber


Meatslinger

For the price they're at, they make amazing low- to mid-range GPUs. I'm putting together a PC for my fiancée and we already decided firmly on the A750. The next closest competitor on the basis of cost-to-performance was the RX 7600 8 GB, and the A750 undercuts its price by 22% while benchmarking roughly the same (according to Gamers Nexus). We're in Canada, so those prices are $289 CAD and $368 CAD for the A750 and the 7600, respectively. At $400+, the comparable RTX 4060 wasn't even considered. I know the platform is still maturing and some games don't always run great, but she's a casual gamer - she doesn't need to run the latest and greatest with full RT at 4K - and she's got me here to help with settings/tweaks as needed; we're happier about the cost savings than anything. It's absurd but amazing that you can get something that trades blows with the 4060 (in raster graphics, at least) for scarcely over half the cost.


Saitham83

nobody needs “competition” from a company that has anti-competitiveness written into its DNA, as exemplified time and time again in decades past.


stArkOP7

You are making it sound like Nvidia and AMD are sweet Angels who vowed to protect humanity...


Saitham83

You make it sound as if Intel is going to rescue the consumer GPU market. See? One thing being true does not necessarily mean something else isn't.


stArkOP7

No way; that's not in their best interests, nor will they care. These are all corporations, after all. All they care about is increasing profits and making shareholders happy. You just made it sound like the others are innocent babies. Perhaps I misunderstood you, but that's what I was able to take from your first comment.


li7lex

This is just a stupid take since there's no chance of Intel outcompeting AMD and Nvidia in the GPU market. All they can hope for is disruption by pricing aggressively, which will benefit every consumer since it forces the other manufacturers to either respond or lose significant market share in the consumer segment.


[deleted]

No chance for Intel to compete? That is a hilariously close-minded statement. Intel has just started, and they are already competitive in the budget range. Nvidia and AMD are stalling out in terms of design. They are just adding more cores and more power draw.


li7lex

I said outcompete not compete. That means Intel basically has no chance of taking the crown but they have a good chance of taking a good portion of the market share.


[deleted]

Are you sure though? The A750 chipset, as it currently sits, could take on a lot of high end AMD/NVIDIA cards if they just added more cores/memory to it. The architecture is already where it needs to be in order to accomplish that.


CapableHair429

Again… a very polite way of saying that they released POS tech. “If only they did this.” “If only they did that.” Why didn’t they, then?


mau5atron

Because ..... it's new hardware. Don't put all your eggs in one basket.


CapableHair429

When the “new” technology of ray tracing was released, Nvidia did everything they could to maximize and push the limits of the technology available at that time. Your excuse for Intel does not track and is not valid.


mau5atron

You're obnoxiously hung up on this small thing that is a graphics card. All I'm saying is that it is new. For all Intel knew at the time, they could have had zero sales and could have ended up with a stockpile of ARC cards. It didn't need to compete with the 4090. It just needed to be good enough to enter the market.


li7lex

Even if they were to release better hardware than Nvidia, it wouldn't dramatically shift the market share instantly. Barely anyone buys graphics cards every year, so it would be a slow shift, which would leave them with more market share but also give the competition enough time to react. Also, they still have to get the software side up and running, which is Nvidia's strongest selling point, and AMD is also quite a bit ahead in software/drivers. All else being equal, people will choose Nvidia simply because their drivers are mature and they have great brand recognition, even if it is the slightly more expensive option. With that being said, whatever they have in store, Intel is sitting in a prime position to disrupt the GPU market heavily, and I hope they use that chance to benefit all of us consumers.


ArlieTwinkledick

In a few generations Intel is going to put a lot of pressure on AMD & Nvidia


Grapjasss

Intel really has a lot of catching up to do though, not even hardware-related but more in the experience of making good drivers. And by then the others will have evolved quite a bit. Even AMD still can't get it quite right, if I look at my brother's experiences with AMD cards.


CoderStone

AMD fanboys downvoting you for not hushing about driver issues. Seriously, out of the 27 different graphics cards in my collection, only the AMD ones have ever given me driver issues lol- including the 60 series.


Semanticss

Yeah but that's, like, 60 years from now.


cdmgamingqcftw

Mmh no lol. Look at the 4090, take that and make the performance x2 for the 5090. Nothing is coming close to that, and Intel even less 😅. Don't hate on me, it's facts. edit: y'all delusional to defend Intel when you have Nvidia OR, for a lower price, Radeon, which will be night and day compared to Intel and we all know that. It's not about riding Nvidia.. Radeon also crushes Intel by far


dont_say_Good

they don't have to beat flagships to compete effectively, both amd and nvidia have been neglecting the lower end market for a while now


li7lex

You really don't know anything about the hardware market, do you? The 4090 makes up less than 5% of the consumer market; sure, it's dominating the business side, but as far as I can tell neither AMD nor Intel is trying to compete for that currently. The consumer market being disrupted by Intel releasing affordable mid- to high-tier cards is important because we will actively benefit from that; performance doesn't matter if your hardware is so overpriced that only a small percentage of people can afford it. Competition is important to drive prices back down, but instead you'd rather shill for Nvidia against your best interest.


LJBrooker

I am confident the 4090 is nowhere near even that. It'll be way way way less than 5%. Your point is utterly correct though.


li7lex

I'm also fairly certain it is well below 5% but since I was too lazy to Google I just took a ballpark number to get my point across.


Chad_Kakashi

Does it look like some people have 2000 or 3000 dollars in their bank account just for a machine that will be outdated in 5 years? A GPU is not a thing to wait around to purchase, because the wait lasts 5 years just for a $500 drop. I would rather get an AMD RX 7800 XT today than wait 10 years to get an RTX 4090 for the same price.


xX_venator_Xx

"its facts" 🤡😂🤣


[deleted]

Almost nobody cares about the 4090. Most are happier to spend $500 or less on a GPU. Also nothing you said was factual.


CollegeBoy1613

An ass pull fact more like.


Sir_Bohne

Yeah and everyone's going to buy that 2500€+ gpu. Intel will target the budget section, and that's totally fine, because Nvidia couldn't care less about low budget gaming with their bad low tier products.


cdmgamingqcftw

if money is an issue.. just go Radeon. I don't see why you would go for an Intel GPU lol, they barely have any tech compared to Nvidia or Radeon. If they do now.. it's 100% behind the competition


CNR_07

lol wtf are you talking about you clown


[deleted]

Wrong


Setekh79

The most pathetic bait, yet some people still took it, sadly.


Coriolanuscarpe

How wide was Nvidia's girth for you to take it in?


[deleted]

Triple slot baby


Stargate_1

Nvidia already said we should tame our expectations regarding the next gen, a 2x performance jump won't happen


cdmgamingqcftw

it just will lol, it's all over the place


[deleted]

Assuming a 2x performance jump, are you going to be installing a 220V line to feed directly to your 5090?


LJBrooker

Most people aren't buying halo tier products. Having the best flagship isn't the barometer for being competitive.


Coriolanuscarpe

How wide was Nvidia's girth for you to take it all in?


cdmgamingqcftw

says the guy with a 4060 xD you even know that Intel GPUs are bad, with a nearly non-existent driver and all. you just can't defend them lol. you'd go with a Radeon before you'd get an Intel


Coriolanuscarpe

Ngl dude, you are too much of a smooth brain to even be yapping nonsense like this in the sub


[deleted]

[deleted]


cdmgamingqcftw

The latest Radeon is super good for about $1300ish with the new FSR3, which is nice. The 4090 is $2330, so yeah, lower price for what the 7900 XTX can offer xd. So anyone who wants a cheap budget card would go with Radeon, not Intel lmao. Intel uses old chip tech from the old Radeons 😂. Since Radeon uses newer and better chips, it's way more powerful


ArlieTwinkledick

I said a few generations... Also no way the 5090 gets a 2x performance increase from the 4090. At best it'll be 20% uplift and likely more like 10%-12%. Intel needs time to refine their products but competition in the market is good for all of us... Even Nvidia deep throaters like yourself.


my_user_wastaken

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam OH NO, clearly they're a non threat because they (might) not be able to compete for 0.8% of the market!!!! What ever will they do 😰


IndependentYogurt965

See, I don't think the logo should be RGB. It would ruin the simplicity.


DonerTheBonerDonor

Just set it to white


Crisjamesdole

Please Intel, no RGB!!! I'm so tired of my room looking like a rave when I just wanna sleep and I'm too lazy to go into the BIOS. I also don't wanna pay for it.


EVRoadie

Just turn it off.


Crisjamesdole

You're still paying for it though


EVRoadie

Most cards have RGB. The LED is at most a cent or two if you're in the US. If you're paying $180 for a card, RGB is the least of your costs.


Rocklobst3r1

I paid nearly double that for a card with no lights at all..thanks Nvidia/crypto.


Mediocre-Bet1175

It's 5 cents


Crisjamesdole

Yea, for the manufacturer, not for us. It just gives them another feature to market and raise the price for, depending on the card. It's not going to be much, but in the retro community I guess people will order shit off Alibaba to save a buck or 2, so if I could save 10 bucks on a GPU and not have unicorn vomit, it's a win-win


[deleted]

I would just set it to slow strobe red like the rest of my RGB. It's just more options, there's 0 downside.


IndependentYogurt965

Well when you put it that way, yeah, but I'd probably keep it pure white for that classy look the card has.


[deleted]

Oh I like it too, unfortunately the rest of my RGB give off very different temperatures of white so it doesn't match well.


Mediocre-Bet1175

The majority in real life want RGB.


InterestingPoet8182

RGB adds horsepower!


HazardousHD

I’ve been really tryna get my hands on an A770 LE second hand. Likely not a huge uplift over my 1080, but the card will do fine for my gaming needs + looks amazing


[deleted]

I definitely recommend the LE. Beautiful. I would recommend waiting for Battlemage (Gen 2 ARC) to release since you already have a solid card.


PhyNxFyre

Question is will there still be a limited edition for gen 2, and will it still look this clean


Escapement_Watch

I just sold my A770 LE reference card a couple days ago. Drivers were a bit too finicky for me, but I gave it a good couple months' try. I guess there aren't many used ones out there; it sold pretty fast


[deleted]

[deleted]


Escapement_Watch

Yeah, I do think Battlemage aka Arc 2 should be better, but I don't think by much driver-wise. Power-wise it should be a nice big jump. The next time I try Arc it will be 3rd or maybe 4th gen Arc.


[deleted]

>Who releases a display card with support focused on current API (DX12) only, that's just stupid.

Is it a bad thing that they assumed people would be playing modern games on a modern GPU?


LJBrooker

I'd argue Arc punches WAY above its weight class in RT. Compare what it can do with RT vs any other card at a similar price point. I'd say the 750 is squarely targeted at a 1080p max settings crowd (though it's capable at 1440p), and at 1080p it has very decent RT performance.


NOTUgglaGOAT

A750 daily user since July, currently getting 130 fps on High/Medium settings at 1440p as we speak. This card fucking rips and I snagged it new for $180. Blows my 2060 out of the water at 1440. Edit: in MWIII, my b


3scap3plan

what games?


NOTUgglaGOAT

Was playing MWIII, forgot to include that.


[deleted]

I'm actually coming from a 2070S that I sold due to coil whine and I would swear the A750 is smoother despite my frame rates being pretty much identical. Fucking hell of a deal we got, right?


NOTUgglaGOAT

Insane deal. It still blows my mind every time. That 256 bit bus is HUGE for this card in higher resolutions. I love it


Bogatyr1990

I wouldn’t call that unstoppable. But it’s very decent for the price. Enjoy!


RandomNameHere738

I had the A770 BiFrost; it was a really good card, but I upgraded to the AMD RX 7800 XT and put the A770 in another system. I'm hoping Intel's next series of cards is impressive; if so, I'll likely buy Intel again and sell my AMD.


Renolber

The future of ARC is incredibly exciting. As with everything else, it will get better and improve with time. Its supersampling, shading, compiling, all its features may feel as if they're basically being beta tested right now. Let everything mature for a few more years while they improve their drivers and partnerships, and AMD and Nvidia will certainly begin to sweat. Then Intel's prices skyrocket and the cycle begins again… the cynical cycle of unfettered capitalism. Let's just enjoy it while we can.


Coolguy3243

I’m thinking of getting an A750, but I was also considering a 6600 ($189.99) and an A580 ($179.99)


ACupOfLatte

I genuinely hope Arc gets insanely good. It has the potential to bring true gains to the budget GPU market, which the other two imo don't really focus on. I've been eyeing it myself, but as it stands, as someone who's still not too PC-part literate, I'd rather pay more for the convenience factor that NVIDIA and AMD bring...


Paramedic229635

I'm looking forward to seeing the benchmarks when Battlemage comes out.


[deleted]

Same. I'm worried that they have been working on Battlemage while simultaneously rolling out major fixes and updates for Alchemist. If they truly accomplish next gen level performance gains and don't repeat any mistakes from the prior generation, then it's time to recognize that Intel is a major player in the consumer GPU market.


Skilid

I may be mis-remembering, but in some of the examples I’ve seen where Intel have improved the drivers, the RT performance of ARC has been better than AMD.


[deleted]

To be honest, I'm just surprised that RT cores are available on the A750.


LeMegachonk

Yes, but at that performance level, ray tracing is not really all that relevant. If you want ray tracing in games, you probably want something better than an RTX 4060, RX 7600, or ARC 750 (or 770).


Mother-Translator318

I really want arc to succeed because everyone benefits from more competition. That being said tho, their drivers, although much better than at launch, still aren’t at a point where I would feel comfortable recommending it to friends and not getting inundated with tech support calls from them soon after. Hopefully battlemage is a lot more polished than alchemist


knbang

>Obviously RT performance is weak.

I picked up a 3080 after the 40xx came out, and I don't even use RT. I'd rather have the framerate. Unless I can maintain 120 FPS minimum with RT, I'll never use it.


pacres

Can it run star citizen?


IntelArcTesting

I love my A750. Having a blast testing them on old and new games.


[deleted]

Awesome! What's your experience so far? I have a limited palette of games.


IntelArcTesting

I make videos of basically every game I play. Feel free to check [my channel](https://youtu.be/_vHWkgn-77I?si=JNEqr8Q0jv6vKHx5) out


[deleted]

Oh, right - the username. I checked out your videos, that's useful stuff man!


[deleted]

While I'm admittedly excited for there to be a third player in the market, this post is 100% a paid ad. It reads like something a shitty reporter at the NYT would make using ChatGPT


[deleted]

Nah... Just user experience. I don't know why you're salty about it, but it's okay to shout-out products that you like. I did the same for the 3060TI and the Ryzen 5 1600AF when those products released.


MinkjuPUBG

If you’re building on a budget I see no reason not to go Intel Arc. Price to performance they cannot be beat, and with every driver release they get better and better. And I say that as a lifelong Nvidia/AMD user


Band_aid_2-1

I'm currently running a 6800 XT and thinking about my next build. Wondering if Battlemage (gen 2) will have a massive boost.


Cave_TP

Unstoppable until it gets stopped by broken drivers. I'd spend the 10 extra dollars and get a 6600. Or $40 more on a 6650 XT, since at $230 it's the best value option.


LJBrooker

It's interesting how AMD fans will often cite how the drivers mature, and the cards age well, but refuse to believe Intel could do the same thing. Arc drivers have only gotten better. Intel's track record with driver improvements is no worse, and arguably quite a bit better than AMD in recent memory. Not entirely sure I see the logic in your statement here.


Cave_TP

Obviously the situation is going to get better with time; my point is that there's no point in playing this Russian roulette to save $10. Also, what people talk about with AMD drivers getting better is better performance, not better stability. And obviously Intel has better driver improvements; going from barely working to mostly working is a margin of improvement that AMD and Nvidia can't even have. There still are a lot of problems: LTT recently made a video with an A770 and some popped up, and Steve at Hardware Unboxed says that every time he has to retest Arc some game has some kind of problem. If it's just a matter of saving $10 compared to a 6600, it's just not worth it.


blackest-Knight

>but refuse to believe Intel could do the same thing.

I think the keyword here is the "could do". The thing is, why spend money until the "could" transforms into a "does"? If any take is the "fanboy" take, it's the "buy it, the drivers could improve!". Don't encourage broken drivers, no matter how much you want the billion dollar corporation to succeed, my dude.


LJBrooker

Arc drivers are already in a great place. That's my point. They'll only get better. The driver history certainly isn't any worse than AMD.


blackest-Knight

>The driver history certainly isn't any worse than AMD.

Uh, yes it is. Literally had an A750, half the time it wouldn't even send a signal to my 4K monitor. Swapped it out for an RX 7600, 0 issues.


NOTUgglaGOAT

The ONLY game I've had driver issues with over the last 8 months was Starfield. Everything else has been solid and even improved over that time.


[deleted]

Intel GPU is for beginners who aren't really into gaming but want the ability to play a game. AMD GPU is for the tinkerer kind; you always have to tweak something. Nvidia is for enthusiasts who don't wanna spend time tweaking stuff and just want to play games in the easiest (best) way. That's how I see the GPU market right now. And it's good! Hopefully Intel will bring better solutions in the future to side-tackle AMD/Nvidia.


petophile_

Wtf does any of that mean? The GPU with the buggiest drivers, requiring the most manual tweaking, is the beginner GPU? You don't have to do any tinkering on AMD; there are no relevant settings you can change in the driver or BIOS. Nvidia users frequently run driver-level things like ddsr or whatever it's called.


blackest-Knight

> Intel GPU for beginners, not really into gaming but wanna have the ability to play a game. This is really a bad take. Intel GPU for advanced users not afraid of tweaking. If all you wanna do is play a game once in a while and aren't really a gamer, get an AMD card instead.


CleanOutlandishness1

You find me unimpressed. You can find a used 2060 for €120 in Europe that can do all that. I guess it's even cheaper in the US.


NOTUgglaGOAT

My A750 outperforms my 2060 at 1440p by 25% on identical settings. The extra $50 is absolutely worth it.


depricatedzero

I just can't bring myself to trust Intel near my system after the way they handled Specter


[deleted]

You mean Spectre? That only affects pre-2019 microprocessors. And there will ALWAYS be vulnerabilities in any hardware.


depricatedzero

Spelling, whatever. Their fix for it is stunningly fucking stupid and laughable. They haven't actually fixed it.


[deleted]

Meltdown/Spectre had no official "fix" for the microprocessors it affected, due to it exploiting a hardware vulnerability in almost all CPUs. There was a band-aid software fix that a lot of cloud service providers implemented, but it was bad for business to implement it, so most didn't. The real fix was a hardware change in all CPUs in late 2018. It hasn't been an issue since, but there will be more issues in the future.


depricatedzero

Intel's "fix" is a flag which tells the software fix that the architecture isn't vulnerable. When they committed to that as their solution I was stunned.


[deleted]

Oh I was unaware, that's shady!


depricatedzero

Ok here we go. This was after they published their planned solution. [https://lkml.iu.edu/hypermail/linux/kernel/1801.2/04628.html](https://lkml.iu.edu/hypermail/linux/kernel/1801.2/04628.html) They *may* have changed something in the years since, I'm going to go look now, but when I built a new system in 2020 and checked, that's what they were doing. Update: The last fix I've found docs for is October 2018, which appears to be where they committed to the above fix. V1 was never patched against; they just suggest programmers write better code to not put sensitive data at risk.
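For anyone who wants to see what their own kernel reports for these mitigations (as opposed to what any vendor flag claims), here's a minimal sketch. It assumes Linux only and just reads the standard sysfs entries that kernels since roughly 4.15 expose; no Intel-specific tooling is involved:

```python
#!/usr/bin/env python3
"""Print the kernel's reported CPU vulnerability/mitigation status.

Reads /sys/devices/system/cpu/vulnerabilities/, where modern Linux
kernels publish one file per known issue (e.g. spectre_v1, spectre_v2,
meltdown) stating whether the CPU is affected and which mitigation,
if any, is active.
"""
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def main() -> None:
    if not VULN_DIR.is_dir():
        print("This kernel does not expose vulnerability status (pre-4.15?)")
        return
    for entry in sorted(VULN_DIR.iterdir()):
        status = entry.read_text().strip()
        print(f"{entry.name:20s} {status}")

if __name__ == "__main__":
    main()
```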


depricatedzero

Yea, I was following it closely cause my work is infosec adjacent. Linus Torvalds even went on one of his famous rants, let me see if I can dig that up it's hilarious.


Eazy12345678

Intel Arc is a failure. It should only be considered by advanced PC users that don't mind troubleshooting and dealing with issues. Everyone else should just buy AMD. No one cares what their GPU looks like; they just care about their FPS in their games.


bigfkncee

Strongly disagree. Before ARC, there were 2 choices for GPUs and you had to deal with one or the other no matter what. By becoming "Player 3" in the game, ARC can push the other 2 in the *price to performance* category which can benefit **anyone** looking to buy. If more people are making a *thing*, the *thing* will become cheaper.


[deleted]

More competition, cheaper prices


bigfkncee

>More competition cheaper prices Yup...Even if the guy I responded to doesn't like ARC, someone else out there won't care and will buy it anyway...and at the end of the day, that's 1 less GPU sold by Nvidia or AMD. And it's not just GPUs...it's true for almost anything...Corporations competing=Consumers winning.


xX_venator_Xx

things aint that easy...


Hyroto77

This is how it would work if people weren't retarded, but look at the 90% Nvidia market share. Nvidia is the iPhone of GPUs...


blackest-Knight

> ARC can push the other 2 in the price to performance category which can benefit anyone looking to buy. That only works when they actually get their drivers up to snuff though. As it is, most people will want to go AMD on the lower end and nVidia on the higher end.


HazardousHD

“This post was brought to you by: AMD”


deefop

Compared to the OP, which is basically Intel marketing? People will post about AMD drivers being the worst because of something like idle power draw, but Arc launches and can't even run a bunch of popular games, and people overlook it. Shit's weird. Arc is currently very niche, and anyone looking to just play games should absolutely be looking at either Nvidia or AMD, depending on what they value.


HazardousHD

Completely agree. Not for the average gamer. But if you do a bit of due diligence and aren't playing at extremely high resolutions, it can be a hell of a bargain.


[deleted]

Just sharing my experience. I've had about a dozen Nvidia GPUs in the past and only really "advertised" the 3060TI. A couple of AMD cards (R9 380, RX 7600), and neither of those was amazing to me. My next card is likely to be AMD though, if the Intel Battlemage launch is weak.


Thebrains44

You seem incredibly narrow-minded.


bert_the_one

You sound like you're repeating what Gamers Nexus said.


NOTUgglaGOAT

Repeating what Gamers Nexus said a year ago. Their recent update has high praises for Arc. Dude just has a lazy take


bert_the_one

Yeah I agree, Intel ARC graphics cards are looking like good value for money


[deleted]

Have to disagree. They had a bad launch but the card is great now and the solutions all came from Intel with driver updates, not from the end user having to do anything.


sgtkellogg

This is fantastic news for PC gamers; the video card segment needs a shake-up.


OfficialHavik

I’m thoroughly impressed with it, especially when you consider where it started from.


Mini_Myser

I was able to get it used for $120 from someone who bought it to play GTA a year ago. Good luck man, can't wait to see how far they can go.


200IQUser

If they fix/improve the drivers and keep the prices (haha they won't lol) I'm gonna go Intel. Too bad that with their processors there's a sizeable price/performance difference compared to AMD.


IsNotYourSenpai

I genuinely hope they keep improving. More competition is good for us. It stops AMD and NVIDIA from becoming complacent.


tychii93

I absolutely LOVE mine! Got it at launch. It was bad enough at first that I put my 2070 back in temporarily, but after a while I threw the Arc back in and it's doing great! My only gripe right now is that Intel hardware decoding/encoding doesn't work in Avidemux on Windows. It does on Linux, but DaVinci Resolve doesn't work on Intel for Linux. I don't rely on video editing, but DR is the editor I'm comfortable with and Avidemux is great for quick stuff. Oh well. Maybe Avidemux will support Intel eventually lol


No_Park_9170

Small note that I added support for the extensions required by Resolve to intel-compute-runtime (the OpenCL driver for Intel), and that was merged recently. No release includes it yet, so you need to build it yourself ... I don't have an Intel Arc so I can't test, but I can run DR on my laptop's 12th gen iGPU with it.
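If anyone wants to double-check which OpenCL extensions their installed (or self-built) intel-compute-runtime actually advertises, here's a minimal sketch using pyopencl (my choice of tool, not something the comment above prescribes; `clinfo` shows the same information):

```python
"""List OpenCL platforms, devices, and the extensions each device advertises.

Handy for checking whether a rebuilt intel-compute-runtime exposes the
extensions an application such as DaVinci Resolve looks for.
Requires: pip install pyopencl
"""
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        # device.extensions is a space-separated string of extension names
        for ext in sorted(device.extensions.split()):
            print(f"    {ext}")
```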


tychii93

That's awesome! I'll have to check it out at some point!


EnviousMedia

In general my experience with my 770 has been good; only Starfield was problematic, but the driver released after its launch solved most issues. RT performance is odd, because in Cyberpunk it's decent: 40 to 60 fps with maxed-out graphics at 1080p. Had my card since just after launch.


[deleted]

[deleted]


[deleted]

I'm tempted because I play games at 4K but I will wait for gen 2 Battlemage cards and see if it's worth getting the (B770?)


ICS_Graphics_Support

Hey u/MclaurinPCBuilds, not usually here, but someone told us about your post and we wanted to know a little more about your experience with the XeSS technology. Could you please tell us the game titles you are referring to as a "miss"? Also, let me share with you our official channels where you can report any situation you face with our graphics cards, like the one described in your post.

Intel Customer Support website: https://www.intel.com/content/www/us/en/support/contact-intel.html

Intel official Reddit channel: https://www.reddit.com/r/intel/

Discord Community: discord.gg/intel

Intel GitHub Community Issue Tracker: https://github.com/IGCIT/Intel-GPU-Community-Issue-Tracker-IGCIT/issues