Gist by jmftrindade — last active December 25, 2023 21:33
A [Google search](https://www.google.com/search?q=x570+aorus+master+won%27t+detect+second+gpu) shows how tricky it can be to get an X570 Aorus motherboard to correctly detect a second GPU, especially if you're already using one of the PCIe slots for a GPU plus an M2 PCIe slot for an NVMe SSD, which was the case for my build. Neither my BIOS nor my OS detected the 2nd GPU card right away.

I spent some time reading the [motherboard manual](https://download.gigabyte.com/FileList/Manual/mb_manual_x570-aorus-master_1002_e.pdf), but nothing there was immediately helpful. The section on PCIe did mention how to split the available lanes when multiple PCIe Gen 4.0 devices are in use (each of which takes x16 lanes by default), and that's what we rely on here via the "bifurcation" BIOS feature listed below. I got it to work successfully, so I'm sharing the steps here in case somebody out there finds them useful.

Assumptions:

- You already ensured that your PSU can handle both cards. If not, use PC Part Picker to estimate the minimum wattage needed. For my build, an 850W PSU suffices ([see parts list](https://pcpartpicker.com/user/jmftrindade/saved/cwnQRB)).
- You're using Windows 10 as the OS. This might work on Windows 11 too, but I haven't tested it.
- You already have an SSD using one of the two M2 slots (if you don't, other PCIe bifurcation configs might still be an option for you, such as 1x8 + 2x4, or 2x8).
- You already have "Initial Display Output" set to "PCIe 1 Slot" in your BIOS (the default setting).
- The two GPU cards you want to install are Gen 4.0 (in my case, two different editions of the RTX 3060 12GB OC).
- You know what you're doing, e.g., whenever you open your box to add or remove cards, the PSU is turned off and unplugged from the outlet, and you know how to insert/remove GPU cards and plug the EVGA PSU cables into them.
The BIOS config you'll need, in addition to keeping "PCIe 1 Slot" as the Initial Display Output:

Settings tab > "IO Ports":

* PCIEX16 Bifurcation -> PCIE 4x4
* Above 4G Decoding -> Enabled

Settings tab > "Miscellaneous":

* PCIEX16 Slot Configuration -> Auto
* PCIe Slot Configuration -> Auto
* IOMMU -> Auto

Sequence of steps to get Windows to recognize both your cards:

1. Make sure you only have a GPU card in the PCIe 2 slot (the one in the middle; this could also work with PCIe 3, but I don't have enough space to use it). Plug in your peripherals (including an HDMI monitor connected to the GPU), and boot.
2. Windows recognizes the card (listed under "Display Adapters" in "Device Manager"), and either automagically installs the NVIDIA driver for it or gives you a chance to do so. All good; turn the server off.
3. Open your box, and place your 2nd GPU in the PCIe 1 slot. Leave the HDMI monitor that was already plugged into the first GPU card (in the PCIe 2 slot), and plug a second HDMI monitor into the card you just placed in the PCIe 1 slot.
4. Turn the server on and boot into Windows: it should now recognize the 2nd card and install the NVIDIA driver for it (there should now be two entries under "Display Adapters" in "Device Manager"). Confirm visually that both monitors have signal.

You should now be all set.
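Besides the visual check, `nvidia-smi -L` (installed alongside the NVIDIA driver) prints one line per detected GPU and makes a quick scripted check possible. The sketch below parses a hypothetical captured sample of that output; on the real machine you'd feed it the actual command output (e.g., via `subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True)`):

```python
# Count GPUs from `nvidia-smi -L` style output. SAMPLE_OUTPUT is a
# hypothetical capture (UUIDs made up) standing in for the real command.

SAMPLE_OUTPUT = """\
GPU 0: NVIDIA GeForce RTX 3060 (UUID: GPU-11111111-aaaa-bbbb-cccc-111111111111)
GPU 1: NVIDIA GeForce RTX 3060 (UUID: GPU-22222222-aaaa-bbbb-cccc-222222222222)
"""

def count_gpus(smi_listing: str) -> int:
    """Count GPUs in an `nvidia-smi -L` listing (one 'GPU N:' line each)."""
    return sum(1 for line in smi_listing.splitlines() if line.startswith("GPU "))

if __name__ == "__main__":
    n = count_gpus(SAMPLE_OUTPUT)
    print(f"detected {n} GPU(s)")
```

If the count comes back as 1 after installing the second card, recheck the PCIEX16 Bifurcation and Above 4G Decoding settings above before suspecting the hardware.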