Intel Arc B570 review featuring the ASRock Challenger OC: A decent budget option with a few deep cuts

Trimming performance and price on the BMG-G21 GPU.

ASRock Arc B570 Challenger OC (Image credit: Tom's Hardware)



For 2025 (starting with the Arc B580 review), we've upgraded our GPU test PC and modernized our gaming test suite. The new system has an AMD Ryzen 7 9800X3D processor, the fastest current CPU for gaming purposes. We also tested the B580 on our old Core i9-13900K test bed, and most of the results were basically the same as on the 9800X3D — meaning the GPU is the limiting factor in most games.

We're running Windows 11 24H2, with the latest drivers at the time of testing. We used AMD's 24.12.1 drivers, Nvidia's 566.36 drivers, and Intel's preview 6256 drivers for the B570. We also retested the B580 with the publicly available 6256 drivers, while the Arc A770 used the 6319 drivers.

Note that these changes mean all the results from our GPU benchmarks hierarchy, while still valid for when they were run, need to be refreshed. We'll be working on a revised GPU hierarchy in the coming weeks, but it will be a bit before that's fully ready — we want at least all the current generation cards to be included, and it's no secret that both Nvidia Blackwell RTX 50-series GPUs and AMD RX 9000-series RDNA 4 GPUs are incoming.

Our PC is hooked up to a 32-inch Samsung Odyssey Neo G8, one of the best gaming monitors around, allowing us to potentially experience some of the higher frame rates available on the fastest GPUs. Most games can't get anywhere close to the monitor's 240 Hz limit, especially not with budget to midrange hardware like the Arc B570 and its direct competition.

The new GPU test suite (revised since the B580 review) consists of 22 games. We dropped Call of Duty: Black Ops 6 from the suite due to frequent changes and some other oddities, and we're still looking at other potential changes, but this is where we're at for now. We've also scaled back the ray tracing tests, mostly because outside of a few select games, RT often kills performance for debatable image quality upgrades. So, while more of the games support RT, it's only enabled in six of them — and even then, the visual upgrades are only really noticeable in three. The remaining 16 games are run in pure rasterization mode.

All 22 games were tested without any upscaling or frame generation. We'll see about doing additional XeSS testing in the future, but trying to compare DLSS 2/3 (and soon 4), FSR 2/3, and XeSS performance without accounting for differences in image quality strikes us as a poor baseline for measuring performance. Plus, we'd rather the default in games be native rendering, leaving upscaling and framegen as true performance-boosting options — so you can break 120 fps or 144 fps, rather than just trying to reach 60 fps.

All games are tested using 1080p 'medium' settings (the specifics vary by game and are noted in the chart headers), along with 1080p, 1440p, and 4K 'ultra' settings. Some may wonder about the reasoning behind the selected settings, so let's quickly elaborate.

What we want to show with graphics cards is how performance scales. We include 1080p medium as a baseline "everything released in the past few years ought to handle this" setting. Moving to 1080p ultra then provides enough of a gap to be interesting — sometimes it's only 10% slower, but in other games it might run half as fast as medium. If we tested 1080p high instead, that would potentially be one less useful piece of information.

Going beyond 1080p ultra, we don't want to change both the resolution and the settings, since there's a lot of overlap between, for example, 1440p medium and 1080p ultra. So we just test 1440p and 4K at ultra, at least where it makes sense. And keep in mind that today's ultra is tomorrow's high, the next day's medium, and next week's low — except it's more like a year or so between each level.

The end result is that our tests show how GPUs run at comparable settings, where some designs may have shortcomings (e.g. insufficient VRAM or bandwidth), and they provide a way to extrapolate how things would run at other settings. While we don't test 1440p or 4K at medium settings, if you check the 1080p medium to ultra scaling on a slower GPU from the same vendor, that scaling should also apply (roughly) to a higher tier GPU at higher resolutions.
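
To make that extrapolation concrete, here's a minimal sketch in Python using invented fps numbers (nothing here comes from our actual charts):

    # Hypothetical numbers, purely to illustrate the extrapolation.
    # A slower GPU from the same vendor, measured at 1080p:
    slow_1080p_medium = 90.0  # fps
    slow_1080p_ultra = 60.0   # fps
    medium_to_ultra = slow_1080p_ultra / slow_1080p_medium  # ultra runs at ~67% of medium

    # A higher-tier GPU from the same vendor, measured only at 1440p ultra:
    fast_1440p_ultra = 80.0   # fps

    # Rough estimate of the untested 1440p medium result:
    fast_1440p_medium_est = fast_1440p_ultra / medium_to_ultra
    print(f"Estimated 1440p medium: {fast_1440p_medium_est:.0f} fps")  # ~120 fps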

As we're in the process of retesting everything on our new PC and test suite, we're limiting the number of comparison points. The most direct competition for the B570 is a bit hard to pin down. The RTX 4060 and RX 7600 XT clearly cost more, but we'll include them as the "step up" options alongside the B580. The RX 7600 also costs a bit more, but it's fairly close. Then we drop to the RX 6600 and RTX 3050 as sub-$200 options that cost less than the B570. Besides those cards, we also have Nvidia's RTX 3060 12GB (which isn't so readily available these days), plus the Arc A770 and A750.

The primary competition for the B570 ends up being older GPUs that are on their way out, and we don't really expect any new AMD or Nvidia GPUs to target the sub-$250 market. Maybe we'll be wrong, but we suspect the eventual RTX 5060 will cost $299 or more, and the RX 9060 likewise. That at least gives Intel a clear win as the least expensive new graphics card. If you want an idea of where other GPUs might land, check out our full GPU benchmarks hierarchy, then take the percentage difference from the hierarchy and apply it to the test data from this review.
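
As a quick illustration of that math, here's a tiny Python sketch; the fps value and the hierarchy gap are invented, not taken from our data:

    # Invented numbers: suppose the hierarchy shows another GPU running 25%
    # faster than the B570 overall. Apply that gap to a B570 result from this review.
    b570_fps = 70.0        # a B570 result from this review (made-up value)
    hierarchy_gap = 0.25   # the other GPU is 25% faster in the hierarchy
    estimated_fps = b570_fps * (1 + hierarchy_gap)
    print(f"Estimated: {estimated_fps:.0f} fps")  # ~88 fps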

Our test PC runs Windows 11 24H2 with all updates applied. We're also using Nvidia's PCAT v2 (Power Capture and Analysis Tool) hardware, which means we can capture real power use, GPU clocks, and more during our gaming benchmarks. We'll cover those results on the power testing page.

Finally, because GPUs aren't purely for gaming these days, we've also run some professional and AI application tests. We previously tested Stable Diffusion using various custom scripts, but to level the playing field we're turning to standardized benchmarks. We use Procyon to run the AI Vision test as well as the Stable Diffusion 1.5 and XL tests; MLPerf Client 0.5 preview for AI text generation; SPECworkstation 4.0 for Handbrake transcoding, AI inference, and professional applications; 3DMark's DXR Feature Test to check raw hardware RT performance; and finally Blender Benchmark 4.3.0 for professional 3D rendering.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • Gururu
    If this becomes more available than the B580, I would happily put this into my little brother or sister's new build. $200-250 is absolutely budget, and I guess the performance is better than integrated solutions.
    Reply
  • GenericUser2001
    Any thoughts on doing a performance test of this and the B580 using a more budget-oriented processor? Quite a few other websites have been retesting the B580 and found that it has some sort of driver overhead issue, and when paired with a more modest CPU like a Ryzen 5600, the B580 often ends up falling behind a Radeon 7600 or GeForce 4060 in the same games it leads in when paired with a high-end CPU.
    Reply
  • Elusive Ruse
    Thanks for the review, Jarred. I like that you don't skip higher resolutions and RT, which might not be as relevant for a budget GPU, but in my opinion they offer good insight into the overall gen-on-gen improvement.

    The price point is pretty good, and I think many buyers would rather buy a new release, with the potential for higher performance from better drivers down the line, than buy a used card or an older-generation card for the same money and performance.
    Reply
  • das_stig
    Am I misinterpreting the chart, or why buy a B5x0 when the A7x0 is superior in most things, including price, except for extra wattage and boost clock?
    Reply
  • Notton
    das_stig said:
    Am I misinterpreting the chart, or why buy a B5x0 when the A7x0 is superior in most things, including price, except for extra wattage and boost clock?
    If you're looking at the same charts I am looking at, yes.
    B570 > A750 and B580 > A770 in a majority of games.
    There are some exceptions where this flips at some settings, like TLoU at 1080p ultra, but it reverts to the B570 being dominant at 1080p medium.
    Reply
  • Giroro
    The B570 doesn't really outperform an RTX 3060. That's a bummer, even at $200.
    Reply
  • eye4bear
    Day before yesterday I managed to order and pick up after work one of only 3 B580s at the Miami Micro Center, and the other two were gone yesterday on their website. Worked late last evening, so I haven't had a chance yet to install it. It's replacing an Arc A380. If I find out anything interesting, I'll let you all know.
    Reply
  • JarredWaltonGPU
    GenericUser2001 said:
    Any thoughts on doing a performance test of this and the B580 using a more budget-oriented processor? Quite a few other websites have been retesting the B580 and found that it has some sort of driver overhead issue, and when paired with a more modest CPU like a Ryzen 5600, the B580 often ends up falling behind a Radeon 7600 or GeForce 4060 in the same games it leads in when paired with a high-end CPU.
    It all takes time, the one thing I definitely don't have right now. There's a reason the RTX 3050 isn't in the charts either. LOL. But eventually it's something I'd like to investigate... and it will probably be stale before I can get around to it. Because it's time to start testing the extreme GPUs in preparation for the RTX 5090 and 5080. And after that? The high-end cards in preparation for the RTX 5070 Ti and 5070, plus the RX 9070 XT and 9070.

    I should have more ability to do off-the-beaten-path testing in about two months, in other words. <sigh> But it's good to be busy, even if we don't have enough time between getting cards and the launch dates.
    Reply
  • -Fran-
    Thanks for the comprehensive data as always, Jarred.

    And it's kind of sad that the conclusion from most people reviewing it is: "well, the B580 is the better pick if you can find it at MSRP". I wonder if Intel can make this card hit a lower price point? I mean, without actually losing money. Sounds tricky to do.

    And I'm surprised OBS didn't work for you. I would have imagined they'd expose the capabilities of Battlemage the same way as Alchemist for the encoders. Well, I hope a patch is coming, since that's a big miss for me at least :(

    Regards.
    Reply
  • rluker5
    I've got a B580 and noticed a couple of bugs when overclocking.
    1. My PC doesn't like to wake from sleep with an overclock applied to the B580. It will wake, not be happy, and restart, which turns off the OC. No problem if there's no OC. I am running a pretty heavy undervolt on my 13900KF, and it's stable in everything else, but maybe that's contributing to this particular boot issue. It's also not a fresh OS install.
    2. The RAM OC usually doesn't take 21 Gbps right away. I have to do 20, sometimes 20.1, then it takes 21 and the change shows up in GPU-Z and everything else.

    I just thought of the RAM OC finickiness while reading this article, and how I would want to OC the VRAM if I had a B570. Hopefully few others have these issues, but I'm seeing them, so I brought them up.

    Also my B580 has been a bunch faster than my A750 in the few games I've played on it.
    Reply