MAG X870 TOMAHAWK WIFI Beta BIOS

Reporting this as it helped, so you have more data points.

AMD 9800X3D
MSI MPG X870E CARBON WiFi - updated to what the support site currently reports as the latest BIOS: 7E49v1A23
Corsair Vengeance DDR5 6000 CL30 96GB (2x48) (CMK96GX5M2B6000Z30) (rated for and tested at 30-36-36-76, stable)
No GPU - can't get ahold of anything in stock :cry:
2x WD BLACK 1TB SN850X PCIe Gen4 NVMe SSD in slots 1 and 3 (skipped 2 to avoid PCIe lane splitting).
Cooler Master 360 Atmos AIO - did not hook up any RGB LEDs.
Fractal North XL

Upon setting AHCI -> RAID, I was stuck in an infinite reboot loop. Clearing CMOS did not get me out of the loop; the system continued to reboot. POST looked successful at first, but keyboard input was no longer accepted and I could not press Delete to enter the BIOS when prompted.

I downloaded the E7E51AMSI.1A26 beta BIOS, renamed it to MSI.ROM at the root of a USB drive (per the motherboard manual instructions), and flashed it with the Flash BIOS button on the back of the motherboard.

The flash succeeded and allowed me to enter the BIOS. CMOS settings were reset to defaults as well.

I was able to set AHCI -> RAID and successfully re-enter the BIOS, and I then had control over RAID arrays. Unfortunately (likely a separate issue, but I'm not sure) I was unable to get the Windows installer to recognize the RAID array built on the two NVMe SSDs.

Thank you for posting this beta.
 
Did you download the RAID installation drivers from AMD? You'll need to load drivers to recognize your RAID array during Windows install:
Check under the "On-Board PIDE/SATA Drivers". There should be 3 drivers in the zip, and you need to load two of them in Windows for it to recognize your array: RCBottom and RCRAID, I believe, if I recall correctly. I don't remember having to load RCCFG, but it has been a while.
That being said, I would recommend against installing in RAID unless you're doing it for mirroring; the RAID0 and RAID5 configs don't give enough benefit. Both hide your drives' hardware from things like direct SMART access or firmware upgrades, so firmware is quite a hassle to update when required, and the speed penalty through the chipset is fairly significant. With just 2 drives in RAID0 you'll see something like a 150% sequential speed increase, possibly less, because access to the drives goes through the chipset. The penalty is even greater with more than 2 drives, since only one drive at a time can be accessed through the chipset.
RAID5 configs lose too much speed, and replacing a drive is a tricky maneuver; again, they suffer a significant speed decrease because of the multiple drives on the chipset AND the parity calculation.
If you're just looking for fun benchmarks or to see if you can do it, go for it! But for a daily driver/gaming machine, you'll likely come to regret the inconveniences that are traded for that little bit of speed.
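To put rough numbers on the chipset bottleneck described above, here's a toy Python model; the per-drive and uplink bandwidth figures are illustrative assumptions, not measurements:

```python
# Toy model of striped (RAID0) sequential throughput when some array
# members sit behind the chipset's shared PCIe x4 uplink.
# Assumed figures (not benchmarks):
DRIVE_MBPS = 7_000    # per-drive sequential rate, Gen4 x4 class
UPLINK_MBPS = 7_800   # usable chipset-to-CPU uplink bandwidth

def raid0_estimate(cpu_attached: int, chipset_attached: int) -> float:
    """RAID0 stripes reads evenly across members, so the whole array
    runs at the pace of its most constrained drive."""
    n = cpu_attached + chipset_attached
    if chipset_attached:
        # chipset-attached members split the uplink between them
        per_chipset_drive = UPLINK_MBPS / chipset_attached
    else:
        per_chipset_drive = DRIVE_MBPS
    slowest = min(DRIVE_MBPS, per_chipset_drive)
    return n * slowest

print(f"2 drives, 1 behind chipset: ~{raid0_estimate(1, 1):,.0f} MB/s")
print(f"3 drives, 2 behind chipset: ~{raid0_estimate(1, 2):,.0f} MB/s")
```

The point of the sketch is that once multiple stripe members sit behind the chipset's shared x4 uplink, the slowest path gates the whole array, so adding a third drive can actually make sequential throughput worse rather than better.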
Also, did you download the E7E51AMSI.1A26 beta BIOS from this page and apply it to the MPG X870E Carbon WiFi mainboard you listed? I'm surprised they're similar enough to cross-flash the BIOS; you should be applying the E7E49AMSI.1A26 beta BIOS from the Carbon WiFi beta BIOS page here:
 
I see, but my notice was to [christopher.b.amundso157f02e0] only,
who has a different board and seems to have flashed (or be trying to flash) the wrong BIOS.
 
What changed from the last BIOS?

I have the same problems as usual.
First boot: hang with 00 on the display
2nd boot: hang with 84 on the display
3rd boot: hang with 85 on the display

After getting into the BIOS and enabling the options for my system:

Network card in PCIe_2: not initializing
NVMe in Slot #3: not initializing

Pffttt..... Thank you
 
What's your config? Did you load default values after the BIOS upgrade (from the exit menu) and try booting with those? I've found that the BIOS can look reset while there's still some gunk in there, and loading default values has fixed some problems for me.
The X870 Tomahawk shares some PCIe bandwidth between the M2_3 and PCI_E3 slots, so you might be seeing issues with that: whatever NIC you're dropping in there may not be able to run with only two lanes if it can't train down from x4. Per the manual:
PCI_E3 slot will run at x2 speed when installing device in the M2_3 slot. You can switch PCI_E3 slot to x4 in the BIOS, but this will disable the M2_3 slot.
 
The system is working. From time to time. Not reliable.
I'm aware of the lane sharing, but Gen3 x2 is still faster than SATA and spares me two cables.

Specs
9800x3D with 2x32GB G.Skill 5600 CL30 @ AXMP
PCIe Slot #1: RTX 4090 @ Gen4x16
PCIe Slot #2: Asus XG-C100C 10GbE Nic @Gen3x1
PCIe Slot #3: PCIe-Adapter on nvme @Gen3x4 (Intel SSD)

NVME Slot #1: Adata Legend 970 Pro @Gen5x4
NVME Slot #2: Gigabyte Aorus @Gen4x4
NVME Slot #3: Samsung 970 Evo @Gen3x2
NVME Slot #4: Samsung 970 Evo @Gen3x2

Now ... I do not use USB4; it's disabled in the BIOS to bring NVMe Slot #2 from x2 to x4. That works.
BUT ... I can set the sharing for SSD Slot #3 to whatever I want. Sometimes it works, sometimes it does not. Regardless of which SSD is inside - Samsung, Gigabyte, Intel ... it does not matter. Sometimes it works, and sometimes I get 84, 85, or 86 on boot and it does not.

The Asus 10GbE NIC works flawlessly IF it was found during BIOS initialization (though not at full speed: it gets about 7 Gbit/s with one lane, but it works, while the onboard Realtek LAN crashes as soon as I install the Realtek driver, on the same cable and same switch the Asus works with). And there is the same problem as with SSD Slot #3: it does not matter what I set in the BIOS, and it does not matter which card is in the PCIe_2 slot. Sometimes BIOS initialization works and the card works, and sometimes the BIOS does not initialize the card and it does not work. I tried a USB3 card (PCIe x1), for example. Same picture. Sometimes.
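For reference, ~7 Gbit/s on a single lane is roughly what PCIe Gen3 arithmetic predicts; a quick Python sanity check (the protocol-overhead factor is an assumed round number, not a measured value):

```python
# A PCIe Gen3 lane signals at 8 GT/s with 128b/130b encoding; packet
# (TLP) and flow-control overhead shave off roughly another tenth.
LINE_RATE_GBIT = 8.0      # Gen3 per-lane line rate
ENCODING = 128 / 130      # 128b/130b encoding efficiency
PROTOCOL_EFF = 0.90       # assumed TLP/flow-control overhead factor

def gen3_usable_gbit(lanes: int) -> float:
    """Rough usable bandwidth for a Gen3 link of the given width."""
    return LINE_RATE_GBIT * ENCODING * PROTOCOL_EFF * lanes

print(f"Gen3 x1: ~{gen3_usable_gbit(1):.1f} Gbit/s")  # close to the ~7 Gbit/s observed
print(f"Gen3 x2: ~{gen3_usable_gbit(2):.1f} Gbit/s")  # enough for full 10GbE
```

So a 10GbE card in an x1 slot is simply lane-limited; it would need at least Gen3 x2 to reach full line rate.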

I bought this mainboard last year, and with every BIOS update I hope it will solve these problems.
No....
 
Apologies, Svet. You are correct, wrong topic, wrong BIOS. I was going on very little sleep trying to debug that infinite BIOS loop from the night before.

Good news though, I actually downloaded E7E49AMSI.1A26, from this thread:

The motherboard is not bricked; the OS installed fine, all drivers for the X870E installed fine as well, and the system is stable and waiting on a GPU.

I'll edit my first post in this thread with the correction so others don't see it and run into an issue.
 
Bummer, the edit button isn't there any longer, so I can't update it. If an admin wants to X that out with a link to my post just above this, that would help.


Thanks Alyred, yes, that's the one I flashed.

And I did attempt to load those drivers during the Windows install after the RAID array was created with the two drives; unfortunately, no success.

It's alright though. The reason I wanted to try it was that I have memories of RAID0 with two SATAIII SSDs. Load times in Path of Exile were incredibly fast, much faster than a single drive. I wanted to try it out with NVMe RAID0 (2 drives) just to see what the experience was like, but with a warning like that I'm having second thoughts.

Do you know if SATA BIOS RAID suffers from lack of TRIM support as well? If so, I didn't notice any performance impact over a few years' worth of use. The only things installed were Windows 10 and Windows updates, along with a few games and graphics driver updates. It may never have hit a critical point requiring TRIM; the drives were very large for the time, I think 1TB. With modern NVMe speeds I wonder if it would even be a comparison, though.
 
If I recall, TRIM was only an issue on earlier SATA SSDs, but it was largely fixed later on, both through the controller and through another mechanism on the drives themselves, so I don't think that's something you need to worry about these days - but it's been a little while since I looked into it.

I ran RAID0 on my X570 configuration with two 2TB Samsung 980 Pros. While I could get sequential performance around 12,500 MB/s, the real-world difference was fairly unnoticeable. You might shave a couple of seconds off load times and whatnot, but the downsides were definitely a PITA to overcome, and forget any kind of proactive monitoring. Going to 3-drive RAID0 is definitely a waste of time, as you're waiting on the chipset's x4 PCIe link for 2 of the 3 drives with every read and write.
 
New Knowledge.
I just bought a new graphics card, which rendered the x1 PCIe slot (PCIE_2) useless because it is so big that it overlaps the slot.
Since then ... no boot problems with 84, 85, or 86 on the segment display.

Seems like the system stumbles over the occupied x1 slot.
Nice :-)
 
This isn't a BIOS or Motherboard problem.
 
It is, because it does not matter which card is in Slot #2: if it's occupied, I frequently get 84, 85, and 86 in combination with the SSD in Slot #3 not showing up. Without it, the system boots without problems and the SSD in Slot #3 is always initialized.
I tried several different cards.

The BIOS/mainboard is unable to utilize all SSD slots and PCIe slots together when all slots are occupied.
Just my opinion. But I'm willing to learn. Enlighten me as to what the problem really is, please.
 
Apologies then, I wasn't able to follow your post's logic; it looked like you were saying there was a problem with the motherboard simply because your larger graphics card blocked the PCI_E2 slot.
So you're saying you frequently get errors 84, 85, and 86 when using the PCI_E2 slot and the M2_3 port together? What hardware (card brands and models) are you using that causes the problem? Does it ever boot properly, or do you get those codes every time?
I know the Tomahawk X870 shares bandwidth between M2_3 and PCI_E3, which drops PCI_E3 to x2 (it's unclear from the manual whether M2_3 also runs at x2), but that shouldn't impact PCI_E2. The manual doesn't list any codes that start with 8. You'll probably need to list more of your system specs.
 
This is a new issue in the A3 BIOS. After running CMOS CLEAR and entering the BIOS settings screen, the system forcefully reboots once after 10–15 seconds. Has this problem been observed by others? Additionally, the issue where the PCIe version is stuck at 1.0–2.0, causing slow boot times and stuttering, still hasn’t been resolved in A3. This issue has been reported by many users combining MSI X870E/X870 with X3D CPUs and RTX 4000/5000 GPUs. Is there any hope for a fix?
 