Author: parsec
Subject: NVMe RAID0 Speed decreased
Posted: 14 Aug 2016 at 12:44pm
We have seen an occasional issue with performance dropping with 950 Pros, both single SSDs and RAID 0 arrays. The main one we have is a performance drop after waking from Windows Sleep. That remains a mystery, although it may be related to certain CPU registers used by the DRAM memory whose contents are not maintained when resuming from Sleep. I don't see that as related to your problem, since a simple reboot fixes this issue.
Another situation with a single 950 Pro was solved with a new Windows 10 installation. This apparently happened when a 950 Pro was connected to the motherboard and used with an existing Windows installation. The reason for the performance loss was never identified, and RAID was not being used or enabled.
Since the Samsung Magician software cannot work with RAID, you cannot get any information about each of the 950 Pros individually, correct?
You're familiar with the Z170 OC Formula and its switches, like the CPU Slow Mode switch that is set to On from the factory. I'm not aware of another switch or UEFI setting that would affect the speed of your RAID 0 array. I imagine you have the PCIe Remapping options set in Storage Configuration (or whatever they are called).
You would think moving an OS installation to the identical model motherboard would not cause any issues, but who knows what underlying things in the Windows Registry, for example, might be causing some weird problem.
When was the last time you ran AS SSD on the RAID 0 array in the old board? How long was it between running the AS SSD benchmark that scored 4104 and getting the new board? I'm wondering if something happened to your RAID array before you moved to the new board. You mentioned you thought you had a corrupt main UEFI; I wonder if that is related to this situation.
I'm curious what the differences are in the AS SSD results between the two boards. Different overclocks, memory speeds, chipset and CPU power-saving options, and Windows Power Plan settings, alone or combined, should not cause that much of a performance difference. I'd love to see screenshots of both AS SSD results, if possible.
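If you want a rough cross-check of the sequential read speed independent of AS SSD, a quick sketch like this works (Python; the file path is a placeholder for any large file already on the array, and note that the OS file cache will inflate the number for a recently written file, so pick a big, "cold" one):

```python
import time

def seq_read_mbps(path, chunk=1024 * 1024):
    """Read a file sequentially in 1 MiB chunks and return MB/s.

    A rough sanity check only -- it does not bypass the Windows file
    cache the way a real benchmark does, so use a large, cold file.
    """
    total = 0
    start = time.perf_counter()
    # buffering=0 opens the file unbuffered at the Python level
    with open(path, "rb", buffering=0) as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / elapsed / 1e6
```

Running it on the same file on both boards (e.g. `print(seq_read_mbps(r"C:\path\to\large_file.bin"))`, path hypothetical) would at least tell you whether the drop shows up outside of AS SSD too.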
Did you have different UEFI versions on the old and new boards? Or is that information lost forever...
Any chance your new board has the 2.50 UEFI with the "Update NTFS module." change? I don't know if that makes a difference or not, just an observation.
RAID with NVMe PCIe SSDs is still very new, IRST version 14 is the first to give us RAID with NVMe SSDs. It's a miracle it works IMO, and thanks to Intel (and ASRock) for giving us this capability. But compared to IRST RAID with SATA drives, IRST RAID with PCIe NVMe SSDs is what I tend to call "fragile".
For example, are you aware that simply clearing the UEFI with the board's jumper or Clear CMOS button will ruin the RAID 0 array if you then start the PC into the UEFI after the UEFI/CMOS clear? One 950 Pro RAID 0 user in this forum literally physically removes his 950 Pros from his board before clearing the UEFI, and puts them back only after re-establishing all the required UEFI settings. That is currently the only way we know of to prevent the failure of a RAID array of NVMe SSDs after a UEFI/CMOS clear. This is also true when a UEFI update is applied to the board, since that sets all the UEFI options to their default values.
Bottom line, I've got nothing I can tell you that will fix it. I wish I did. I stopped using RAID 0 with my 950 Pros, because I clear and update my Z170 Extreme7+ board's UEFI all the time. One of the prices of being a moderator...
Subject: NVMe RAID0 Speed decreased
Posted: 14 Aug 2016 at 12:44pm
A week ago I received my first Z170 OC Formula. I set it up with 3 x Samsung 950 Pros in RAID0 as my boot/only drive. I ended up having 2 issues (BIOS A was corrupt and core temp readings were off). I did a little bit of troubleshooting, but finally figured since I purchased from Amazon that I would just exchange for a new one. Got the new board in and put all the hardware into it, including placing all 3 NVMe drives in the same slots as they were in the previous board. I set all the same BIOS settings and booted into Windows 10, no issues. BIOS A is fine and temperatures are reading properly. It was all rather easy. However, I ran an AS SSD benchmark on my RAID0 and the benchmark dropped from 4104 on the old board to 1470 on the new one. My gut is telling me to blow the array away and rebuild it, reinstall all the software. Anybody else run into an issue like this, or have any ideas on what I could possibly be missing?