The original post: /r/datahoarder by /u/Electronic-Papaya on 2024-06-12 17:15:11.
I'm trying to free up some PCIe slots in my system, so I'm switching from two LSI 2008 cards to a single 16-port LSI 9300-16i. I'm running Linux and using mdadm to run three arrays.
Before attaching any of the drives that hold my data and arrays, I installed the card on its own, updated the firmware to 16.00.12.00, and made sure it was in IT mode. As a test I then connected one of my arrays (4 x 2TB) to the controller and booted the system. The drives themselves were detected fine, but the array was gone: the metadata appears to have been erased, and mdadm didn't recognize any of the drives as being part of an array.
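In case it helps anyone hitting the same thing, this is roughly how I confirmed the superblocks were gone (a sketch; the device names are just what the four members happened to enumerate as on my system):

```
# Look for md superblocks on each member; "No md superblock detected"
# means the metadata really is gone, not just unassembled
mdadm --examine /dev/sd[a-d]

# List whatever signatures (GPT, filesystem, raid member) remain
wipefs /dev/sda /dev/sdb /dev/sdc /dev/sdd
```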
I was able to recover the data and the array by following the steps here: https://raid.wiki.kernel.org/index.php/Recovering_a_damaged_RAID
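The heart of that procedure is recreating the array in place over the same members (a sketch only; the level, chunk size, metadata version, and device order below are placeholders and have to match exactly how the array was originally created, otherwise this destroys data):

```
# --assume-clean writes fresh superblocks without resyncing or
# touching the data area. The geometry here is a placeholder.
mdadm --create /dev/md0 --level=5 --raid-devices=4 \
      --assume-clean /dev/sda /dev/sdb /dev/sdc /dev/sdd

# Sanity-check read-only before trusting the result
mdadm --detail /dev/md0
fsck -n /dev/md0   # assuming an ext filesystem sits directly on md0
```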
However, every time I reboot the same thing happens: the metadata is lost and mdadm no longer recognizes that the drives are part of an array. I don't have this issue with the older LSI controllers.
Any idea what's going on here? When I created the arrays I used the entire drive, so there is no Linux RAID partition on each disk: the array is built on /dev/sda through /dev/sdd rather than /dev/sda1 through /dev/sdd1 (see the sketch below). Not sure whether that has anything to do with it.
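For clarity, this is the difference between the two layouts (the RAID level is just a placeholder; it's not the point here):

```
# Whole-disk members, as my arrays are built -- the md superblock
# lives directly on the raw devices:
mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sd[a-d]

# Partition-based members -- each disk first gets a single Linux RAID
# partition (type FD00 in gdisk terms) and the superblock lives inside it:
sgdisk --new=1:0:0 --typecode=1:FD00 /dev/sda   # repeat for sdb..sdd
mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sd[a-d]1
```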
Edit: This seems to be an issue with the controller and GPT partition tables. As a test, I created an array from a couple of 120 GB SSDs I had lying around. I built the array on the whole drives, and after a reboot the array was still present. Then I realized my other drives all carry GPT partition tables. So I wiped the SSDs, wrote a GPT to each one, and created the array again; this time, after a reboot, the array was gone and mdadm no longer recognized the drives as being part of an array. Not sure how to fix this.
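For anyone wanting to reproduce the test, it was roughly this (a sketch; /dev/sdx and /dev/sdy stand in for the two test SSDs, the RAID level is just an example, and the wipe commands are obviously destructive):

```
# Strip every signature off the test SSDs (test disks only!)
wipefs -a /dev/sdx /dev/sdy

# Write a fresh GPT label to each disk
parted -s /dev/sdx mklabel gpt
parted -s /dev/sdy mklabel gpt

# Build a whole-disk array on top of the GPT-labelled disks; mdadm
# warns that a partition table exists, which is the point of the test
mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sdx /dev/sdy

# After rebooting through the 9300-16i, the superblocks are gone:
mdadm --examine /dev/sdx /dev/sdy
```

If I understand the on-disk layout right, a whole-disk array with the default 1.2 metadata puts the superblock 4 KiB into the disk, which lands inside the GPT partition-entry area, so anything that rewrites or "repairs" the GPT at boot would clobber it.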