Install Windows on NVMe: Why Setup Can’t See the Drive (and the Fix)

You boot the Windows installer. You reach “Where do you want to install Windows?” and… nothing. No disks. Just an empty list and a growing suspicion you’re about to spend your evening learning new swear words.

This is one of those problems that feels mystical until you realize it’s usually a single switch: firmware storage mode, a controller abstraction like Intel VMD, or a driver Windows Setup doesn’t have. The fix is typically boring. Also, the boring fix is usually the correct fix.

The mental model: why Setup “can’t see” an NVMe

Windows Setup is not a mind reader. It sees storage through whatever controller stack the firmware exposes and whatever drivers exist in the Windows Preinstallation Environment (WinPE) image on your USB/ISO. If your NVMe is behind a controller mode that needs a vendor driver, Setup will stare into the void.

The most common trap in 2020s hardware is that your NVMe isn’t “missing” at all. It’s being presented through a layer such as Intel VMD (Volume Management Device) or a RAID mode (Intel RST / “RAID On”), which changes what the disk looks like on the PCIe bus and which driver is needed to talk to it. On some laptops and many business desktops, the default is optimized for fleet management, not your weekend reinstall.

There are also classic mismatches:

  • UEFI vs Legacy boot: you boot the installer in one mode while the disk is partitioned for another, or the firmware refuses to hand off the NVMe properly.
  • Partitioning constraints: dynamic disks, leftover metadata, weird OEM recovery layouts, or stale RAID metadata can hide the disk from Setup or block installs.
  • Lane sharing and physical topology: that M.2 slot shares lanes with a SATA port or another slot; one is disabled when the other is populated.
  • Security features: Secure Boot is usually fine, but some storage stacks (and some “enterprise” BIOS policies) require signed drivers that aren’t in your installer image.

One quote worth keeping in your head when you troubleshoot: “Hope is not a strategy.” — James Cameron. It’s not strictly an SRE quote, but it fits operations work uncomfortably well. The right move is to measure what the installer can see, then change one variable at a time.

Joke #1: Windows Setup isn’t ignoring your NVMe to be rude. It’s just doing that thing where it pretends it can’t see you until you bring the right driver.

Fast diagnosis playbook (check these first)

If you want the shortest path to “disk appears, install continues,” do this in order. Don’t bounce randomly between BIOS menus and driver folders. That’s how time disappears.

1) Confirm the NVMe exists at the firmware level

  • In BIOS/UEFI, look for NVMe storage information, or a storage page listing the NVMe model.
  • If BIOS doesn’t see it, Windows Setup won’t either. Stop and troubleshoot hardware/slot/lane sharing first.

2) Check storage mode: AHCI vs RAID/VMD

  • If you see Intel RST, RAID On, or VMD enabled, assume Setup needs a driver or you need to switch to a mode Windows supports out-of-the-box.
  • For a single NVMe drive, prefer AHCI / NVMe without VMD unless you have a reason not to (BitLocker fleet policies and some corporate images are reasons).

3) Boot the installer in UEFI mode (almost always)

  • If your target is Windows 11, you’re in UEFI territory anyway.
  • UEFI boot + GPT is the stable, modern default. Legacy/CSM is where ancient ghosts live.

4) In Setup, open a shell and check what storage is visible

  • Use diskpart and see whether any disks appear.
  • If diskpart sees the disk but Setup doesn’t show it, it’s typically a partitioning/metadata issue.
  • If neither sees it, you’re in driver/controller mode land.

5) Decide: flip firmware setting vs load driver

  • Personal machine / clean install: disable VMD/RAID, use AHCI/NVMe, proceed.
  • Corporate-managed endpoint: load the approved storage driver so you stay compatible with the org’s imaging and recovery tooling.
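Steps 4 and 5 of the playbook compress into a quick, non-destructive probe from the Setup shell (Shift+F10). A sketch; it only reads state:

```bat
rem Quick triage from the Windows Setup shell (Shift+F10). Read-only.

rem Any disks visible to WinPE at all? (diskpart accepts commands on stdin)
echo list disk | diskpart

rem How did the installer boot? 0x2 = UEFI, 0x1 = Legacy BIOS.
reg query HKLM\System\CurrentControlSet\Control /v PEFirmwareType
```

If the first command shows no disks and the second shows 0x2, you are almost certainly in VMD/RST driver territory, not partitioning territory.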

Interesting facts and context (so the weirdness makes sense)

  • NVMe was built to replace AHCI: AHCI was designed around spinning disks; NVMe was designed for low latency and parallelism on PCIe.
  • Early NVMe boot support was messy: older UEFI implementations needed firmware updates to boot NVMe reliably, especially on 2014–2016-era boards.
  • Intel RST predates widespread NVMe: the tooling and assumptions started in SATA RAID days, then expanded to “manage” NVMe behind abstraction layers.
  • VMD exists for a reason: it can standardize hotplug/error handling and support enterprise-ish features, but it also changes which driver sees the disk.
  • Windows Setup runs WinPE: the driver set is smaller than a fully installed Windows, so “works in my OS” doesn’t imply “works in Setup.”
  • GPT vs MBR isn’t just aesthetics: UEFI systems expect GPT for clean boot behavior; MBR is a compatibility relic with sharp edges.
  • Some OEM images ship with preloaded RST drivers: you wipe the drive, lose the magic, and then wonder why a vanilla ISO can’t see storage.
  • Lane sharing is not hypothetical: many boards disable SATA ports when an M.2 slot is populated because they share chipset resources.

The real root causes (and what they look like)

Cause A: Intel VMD / RAID mode is enabled

What you see: BIOS lists the NVMe, but Windows Setup and diskpart show no disks. Or you see an “Intel RAID” controller but no physical disk.

Why it happens: VMD makes the NVMe appear behind a virtual controller. Without the correct Intel RST/VMD driver in WinPE, the device is effectively invisible.

Fix: Disable VMD / set storage to AHCI (if you don’t need RAID), or load the correct storage driver during Setup.

Cause B: You booted the installer in the wrong mode

What you see: Disk appears, but you can’t install; Setup complains about partition style (“Windows cannot be installed to this disk. The selected disk is of the GPT partition style.” or the MBR equivalent).

Why it happens: Legacy boot expects MBR. UEFI expects GPT. Mixing them produces predictable pain.

Fix: Boot the USB in UEFI mode, then convert the disk to GPT (or vice versa, but don’t, unless you’re reviving a museum exhibit).

Cause C: Stale metadata (old RAID headers, weird partitions, dynamic disk)

What you see: Diskpart sees the disk, Setup lists it but won’t install, or partitions behave strangely.

Why it happens: Leftover metadata can confuse Setup or trigger safety logic. Dynamic disks are also not install targets in the way people wish they were.

Fix: Back up data, then wipe partition tables and metadata from Setup using diskpart (clean), and reinstall.

Cause D: The NVMe is physically not where you think it is

What you see: BIOS doesn’t see the NVMe at all; or it appears intermittently; or it disappears when another device is attached.

Why it happens: M.2 slots can be SATA-only, PCIe-only, or combo; some share lanes with other slots or SATA ports; some boards support NVMe only on specific M.2 slots.

Fix: Use the correct slot, update BIOS, or reconfigure lane-sharing conflicts (often documented in the motherboard manual, which nobody reads until after the outage).

Cause E: Broken or outdated installer media

What you see: Same machine works with one USB stick but not another; odd failures during driver loading.

Why it happens: Corrupt media, outdated ISO, or a tool that built the USB in a weird partition/boot layout.

Fix: Recreate the installer using a known-good method; prefer a current Windows ISO; use a different USB drive.

Hands-on tasks: commands, outputs, and decisions (12+)

These tasks assume you’re either in Windows Setup (Shift+F10 for a command prompt) or booted into a WinPE/repair environment. Some tasks can be run from Linux live media too, but we’ll keep it Windows-centric. The commands below are exactly what I run in production-adjacent situations: short, observable, reversible when possible.

Task 1: Confirm you’re in WinPE and capture the environment

cr0x@server:~$ ver
Microsoft Windows [Version 10.0.22621.1]

What it means: You’re in a Windows environment (WinPE/Setup shell). Version hints whether your media is modern.

Decision: If this is ancient (very old build), consider recreating media before blaming hardware.

Task 2: See if any disks are visible at all

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list disk

  Disk ###  Status         Size     Free     Dyn  Gpt
  --------  -------------  -------  -------  ---  ---
  Disk 0    Online         953 GB      0 B        *

What it means: WinPE can see your NVMe (Disk 0) and it’s GPT.

Decision: If list disk shows nothing, you need a controller-mode/driver fix before partitioning matters.

Task 3: If no disks, confirm drvload is available (you’ll use it to load RST/VMD drivers)

cr0x@server:~$ drvload
DrvLoad: Enumerating driver packages...
DrvLoad: No driver package specified.

What it means: You didn’t load anything yet; WinPE is using its built-ins.

Decision: If disks are missing and firmware has VMD/RAID enabled, plan to load the storage driver or change firmware mode.

Task 4: Enumerate PCI storage controllers (quick clue for VMD/RAID)

cr0x@server:~$ wmic path win32_pnpentity where "Name like '%Controller%'" get Name
Name
Microsoft Storage Spaces Controller
Standard NVM Express Controller

What it means: Seeing “Standard NVM Express Controller” is good; that’s the in-box NVMe driver path.

Decision: If instead you see Intel(R) Volume Management Device or an Intel RAID controller without an NVMe controller, your disk may be behind VMD/RST.

Task 5: Use Setup shell to load a vendor driver from USB

First, find your USB drive letter (in WinPE it is often not E: like you expect).

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list volume

  Volume ###  Ltr  Label        Fs     Type        Size     Status     Info
  ----------  ---  -----------  -----  ----------  -------  ---------  --------
  Volume 0     C                NTFS   Partition    100 GB  Healthy
  Volume 1     D   UEFI_NTFS    FAT32  Removable     16 GB  Healthy

What it means: Your USB is D:.

Decision: Use that letter to load the driver INF.

cr0x@server:~$ drvload D:\drivers\intel_vmd\f6vmdflpy-x64\iaStorVD.inf
DrvLoad: Successfully loaded D:\drivers\intel_vmd\f6vmdflpy-x64\iaStorVD.inf

What it means: The driver loaded into WinPE.

Decision: Re-run list disk in diskpart. If the disk appears now, you’ve confirmed VMD/RST was the blocker.

Task 6: After driver load, re-check disk visibility

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list disk

  Disk ###  Status         Size     Free     Dyn  Gpt
  --------  -------------  -------  -------  ---  ---
  Disk 0    Online         953 GB   953 GB        *

What it means: The NVMe is now visible and empty/unallocated.

Decision: Proceed with partitioning and install. Or, if you’re in a corporate environment, record the driver and firmware setting for standardization.

Task 7: Inspect the disk and existing partitions (spot OEM weirdness)

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> select disk 0

Disk 0 is now the selected disk.

DISKPART> list partition

  Partition ###  Type              Size     Offset
  -------------  ----------------  -------  -------
  Partition 1    Recovery           990 MB  1024 KB
  Partition 2    System             100 MB  991 MB
  Partition 3    Reserved            16 MB  1091 MB
  Partition 4    Primary           952 GB  1107 MB

What it means: A normal UEFI GPT layout. Nothing alarming.

Decision: If you’re reinstalling, you can delete partitions in Setup UI or wipe with diskpart (next task) for a clean slate.

Task 8: Wipe the disk (the “stop being clever” option)

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> select disk 0

Disk 0 is now the selected disk.

DISKPART> clean

DiskPart succeeded in cleaning the disk.

What it means: Partition table and metadata are removed. The disk is now blank.

Decision: If you needed data, you already made a mistake earlier. If this is a clean install, this is often the fastest path to sanity.

Task 9: Convert to GPT explicitly (UEFI install best practice)

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> select disk 0

Disk 0 is now the selected disk.

DISKPART> convert gpt

DiskPart successfully converted the selected disk to GPT format.

What it means: Disk is GPT, suitable for UEFI boot.

Decision: If your installer was booted in Legacy mode, stop and reboot it in UEFI mode instead of forcing MBR. Don’t build a fragile system on purpose.

Task 10: Create partitions manually (when Setup UI is being weird)

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> select disk 0

Disk 0 is now the selected disk.

DISKPART> create partition efi size=260

DiskPart succeeded in creating the specified partition.

DISKPART> format quick fs=fat32 label="SYSTEM"

  100 percent completed

DiskPart successfully formatted the volume.

DISKPART> create partition msr size=16

DiskPart succeeded in creating the specified partition.

DISKPART> create partition primary

DiskPart succeeded in creating the specified partition.

DISKPART> format quick fs=ntfs label="Windows"

  100 percent completed

DiskPart successfully formatted the volume.

What it means: You now have a sane EFI + MSR + primary layout.

Decision: If Setup was failing due to partition confusion, this often unblocks it. Install to the NTFS “Windows” partition.
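The same layout is scriptable: diskpart accepts a command file via diskpart /s, which is handy if you rebuild machines more than once. A sketch under the same assumptions as the transcript above (disk 0 is the target, 260 MB EFI partition); the file name is arbitrary and the script is destructive:

```bat
rem layout.txt -- run from the Setup shell as:  diskpart /s layout.txt
rem DESTRUCTIVE: wipes the selected disk. Confirm the number with "list disk" first.
select disk 0
clean
convert gpt
create partition efi size=260
format quick fs=fat32 label="SYSTEM"
create partition msr size=16
create partition primary
format quick fs=ntfs label="Windows"
```

Scripted diskpart fails fast and loudly, which beats typing the same eight commands into a blue console at 11 p.m.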

Task 11: Verify boot mode clues (UEFI vs Legacy) from WinPE

cr0x@server:~$ reg query HKLM\System\CurrentControlSet\Control /v PEFirmwareType
HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control
    PEFirmwareType    REG_DWORD    0x2

What it means: 0x2 indicates UEFI; 0x1 indicates BIOS/Legacy.

Decision: If you’re in Legacy mode but want GPT/Windows 11, reboot the installer and select the UEFI boot entry for your USB device.
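If you’d rather have the answer in words than in hex, a small cmd snippet (hypothetical; written for a .cmd file, so use single % signs when typing interactively) translates the value:

```bat
rem Prints the boot mode based on PEFirmwareType (0x2 = UEFI, 0x1 = Legacy BIOS).
for /f "tokens=3" %%A in ('reg query HKLM\System\CurrentControlSet\Control /v PEFirmwareType ^| find "PEFirmwareType"') do (
  if "%%A"=="0x2" (echo Booted: UEFI) else (echo Booted: Legacy BIOS)
)
```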

Task 12: Check for BitLocker or encryption blockers (common on corporate laptops)

cr0x@server:~$ manage-bde -status
BitLocker Drive Encryption: Configuration Tool version 10.0.22621
Copyright (C) 2013 Microsoft Corporation. All rights reserved.

Disk volumes that can be protected with BitLocker Drive Encryption:
Volume C: [OS]
[OS Volume]
    Size:                 952.87 GB
    BitLocker Version:    2.0
    Conversion Status:    Fully Encrypted
    Protection Status:    Protection Off
    Lock Status:          Unlocked

What it means: The volume is fully encrypted but protection is currently suspended. Booted on another machine (or after firmware changes), the same volume could show as Locked instead.

Decision: If volumes are locked, you may need the recovery key or a full wipe. Don’t assume “disk missing” when it’s “disk locked.”

Task 13: If Setup installs but won’t boot, repair the EFI boot files

Assign drive letters and rebuild boot files using bcdboot.

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list volume

  Volume ###  Ltr  Label        Fs     Type        Size     Status     Info
  ----------  ---  -----------  -----  ----------  -------  ---------  --------
  Volume 0     W   Windows      NTFS   Partition    952 GB  Healthy
  Volume 1     S   SYSTEM       FAT32  Partition    260 MB  Healthy    System
cr0x@server:~$ bcdboot W:\Windows /s S: /f UEFI
Boot files successfully created.

What it means: The EFI System Partition has boot files.

Decision: If the system previously failed to boot with “no boot device,” this often fixes it—assuming firmware boot order is sane.

Task 14: Identify whether the NVMe is showing as a standard NVMe device post-install (sanity check)

Once Windows is installed (or in a recovery command prompt), you can query storage controllers.

cr0x@server:~$ pnputil /enum-devices /class SCSIAdapter
Microsoft PnP Utility

Class: SCSIAdapter
Instance ID: PCI\VEN_144D&DEV_A808&SUBSYS_...
Device Description: Standard NVM Express Controller
Status: Started

What it means: You’re not dependent on an exotic storage driver for basic NVMe access.

Decision: For a personal rig, this is the happy path. For corporate images, you may still need RST/VMD for policy reasons—don’t freelance on managed endpoints.

Driver loading: when you actually need it (and when you don’t)

Loading a storage driver in Windows Setup is a power move, but it’s also how people accidentally install a mismatched driver and then wonder why the machine bluescreens later. Your goal is to avoid unnecessary drivers unless your hardware requires them.

When you need to load a driver

  • Your firmware has VMD enabled and the NVMe is behind it.
  • Your system is configured for Intel RST RAID even with a single disk.
  • You are installing onto an actual RAID volume.
  • You have a new platform where the inbox WinPE driver set doesn’t cover the controller.

When you should not load a driver (and should change firmware settings instead)

  • Single NVMe disk, no RAID required, no corporate imaging constraints.
  • You just want the disk to appear. Disabling VMD and using standard NVMe is simpler and usually more reliable.

There’s also a practical reality: the Windows installer UI “Load driver” workflow is fine, but I trust drvload more because it provides a clear success/failure signal and doesn’t hide error messages behind a polite dialog.
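It also scripts cleanly. If your USB carries a folder of storage INFs, you can load them all in one pass; a sketch, assuming a hypothetical D:\drivers\storage folder (interactive form; double the % signs inside a .cmd file):

```bat
rem Load every driver package in the folder; drvload reports success/failure per INF.
for %i in (D:\drivers\storage\*.inf) do drvload "%i"
```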

Firmware settings that make or break NVMe installs

Most “Windows Setup can’t see my NVMe” cases end with one of these changes. The exact menu names vary by OEM, but the concepts are stable.

1) Disable Intel VMD (or set it to “off” for the NVMe slot)

On many systems you’ll see “VMD Setup Menu” or “Enable VMD Controller.” If the NVMe is mapped under VMD, WinPE might not have the driver. Disable it unless you need enterprise features or are installing a corporate image that expects it.

2) Switch Storage Controller mode: RAID/RST → AHCI

For a single drive, AHCI/NVMe is the least surprising configuration. RAID mode often exists for factory imaging and IT standardization. If you’re not using RAID, don’t cosplay as a storage array.

3) Boot mode: UEFI only, disable CSM/Legacy

UEFI boot is the modern baseline. CSM (Compatibility Support Module) is how you end up with MBR installs on hardware that expects GPT, and then you get to learn about boot repairs. Fun for exactly nobody.

4) Secure Boot and TPM

Secure Boot usually doesn’t hide NVMe devices. But it can block unsigned driver loading in some environments. If you must load a storage driver and it won’t load, temporarily disabling Secure Boot can be a diagnostic step—not a permanent lifestyle choice.

5) NVMe slot configuration and lane sharing

Some boards have multiple M.2 slots where only one is wired to the CPU, others to the chipset, and some share lanes with SATA. If the NVMe disappears when you plug in another drive, this is not paranormal activity; it’s topology.

Joke #2: Lane sharing is the motherboard’s way of saying, “You can have two nice things, but not at the same time.”

Common mistakes: symptom → root cause → fix

1) Symptom: BIOS sees NVMe, Windows Setup sees no disks

Root cause: VMD/RST/RAID mode enabled; missing storage driver in WinPE.

Fix: Disable VMD / switch to AHCI, or load the correct Intel RST/VMD driver via drvload or Setup’s “Load driver”.

2) Symptom: Disk shows in diskpart, but not in the Setup partition list

Root cause: Setup UI filtering, weird partition metadata, or the disk is offline/read-only.

Fix: Use diskpart to check attributes and clear them; delete partitions or clean the disk.

3) Symptom: “Windows cannot be installed to this disk. The selected disk is of the GPT partition style.”

Root cause: Installer booted in Legacy/CSM, but disk is GPT.

Fix: Reboot and choose the UEFI boot entry for the USB installer, then install to GPT.

4) Symptom: “Windows cannot be installed to this disk. The selected disk has an MBR partition table.”

Root cause: Installer booted in UEFI mode; disk is MBR.

Fix: In the Setup shell: diskpart → clean → convert gpt, then install.

5) Symptom: Drive appears only after you remove another SSD/HDD

Root cause: Lane sharing disables one port/slot when another is used.

Fix: Move drives to different ports/slots per board topology; sometimes updating BIOS improves compatibility.

6) Symptom: Setup can see the disk, install completes, but system won’t boot

Root cause: Wrong boot mode, missing EFI boot entry, or boot files written to the wrong disk (common with multiple drives attached).

Fix: Disconnect other drives during install; repair with bcdboot; confirm UEFI mode and boot order.

7) Symptom: NVMe disappears after BIOS update or “reset to defaults”

Root cause: Defaults re-enabled VMD/RAID; storage mode changed.

Fix: Re-apply known-good firmware settings; document them before you update BIOS next time.

8) Symptom: “No signed device drivers were found” when loading drivers

Root cause: Wrong driver architecture (x86 vs x64), wrong INF, Secure Boot policy, or you pointed Setup at a folder without the correct INF.

Fix: Use the correct x64 F6 driver package; load the specific .inf; if needed, temporarily relax Secure Boot for diagnosis.

Three corporate mini-stories from the trenches

Mini-story 1: The incident caused by a wrong assumption

They were rolling out a refreshed laptop model to a large set of users. The imaging process was “known good” because it had worked for years: boot WinPE, apply WIM, reboot, done. On the new model, WinPE booted fine, networking worked, and the task sequence started. Then it failed at the “partition disk” step because there were no disks.

The first assumption was the obvious one: “bad batch of SSDs.” Someone even swapped a few drives, because that’s what you do when you’re in a hurry and the failure looks physical. It didn’t help. Then the assumption shifted to “WinPE is broken,” so they rebuilt the boot media. Still no disks.

The actual cause was boring: the vendor shipped the new model with Intel VMD enabled by default, and the org’s WinPE image didn’t include the VMD/RST storage driver. The old models were in AHCI mode, and WinPE’s built-in NVMe support handled them fine. Same process, new assumption, different reality.

Once they loaded the correct driver, disks appeared instantly. The real fix was process: they added a hardware detection step to the imaging pipeline that checked for the VMD controller and refused to proceed without the proper driver loaded. No more silent failure halfway through an install.

Mini-story 2: The optimization that backfired

A different org wanted faster provisioning. Someone had a clever idea: keep RAID mode enabled across the fleet, even for single-disk machines, because it “standardizes the storage stack” and “lets us move to RAID later.” They also liked that it made the BIOS menus look uniform in screenshots.

It worked—until it didn’t. A few months later they updated their Windows installer media to a newer build and trimmed their driver pack to reduce size. The NVMe driver was still present (standard), but the RST/VMD package got removed because it “looked optional” and hadn’t been needed on older systems.

The result was a provisioning slowdown that looked like random failures: some models still exposed NVMe directly, others used VMD, and the simplified driver pack only worked on half the fleet. The helpdesk saw “no disks found,” and the first-line workaround became “switch to AHCI.” That broke the standard, and it also broke existing installs on machines where Windows was installed with RAID drivers enabled.

They eventually reverted the optimization. Not because RAID mode is evil, but because consistency is a contract: if you standardize on an abstraction layer, you must keep the driver story tight from WinPE to full OS. Removing “unused” drivers is only safe when you actually understand what “unused” means across hardware generations.

Mini-story 3: The boring but correct practice that saved the day

A small SRE-adjacent team managed lab machines used for build pipelines. These machines were not glamorous, but they were critical. They had a standard operating procedure for bare-metal rebuilds: unplug all non-target drives, document BIOS settings, install in UEFI mode, verify GPT, then reattach secondary storage.

Someone called it paranoid. It wasn’t. One day a build host died and needed a fast reinstall. The host had two NVMe devices: one for OS, one for build cache. If you install Windows with both attached, Setup has an annoying habit of placing the EFI System Partition on whichever disk it feels like, especially if one is “Disk 0” today and “Disk 1” tomorrow.

They followed the checklist. OS NVMe only. Install. Post-install, they verified the EFI partition was on the correct disk and that bcdboot wasn’t needed. Then they reattached the cache drive. The host returned to service without the kind of boot weirdness that costs you a day later when someone updates firmware and the disk enumeration order changes.

That practice doesn’t win innovation awards. It wins uptime. In production, uptime is the only award that matters.

Checklists / step-by-step plans (do this, not that)

Plan A: Clean personal install on a single NVMe (recommended)

  1. Disconnect other drives (SATA SSDs/HDDs, extra NVMes). Prevent boot files landing on the wrong disk.
  2. Enter BIOS/UEFI:
    • Set boot mode to UEFI (disable CSM/Legacy).
    • Set storage mode to AHCI or NVMe (disable RAID/RST if you’re not using it).
    • Disable VMD if present (or at least for the NVMe slot).
  3. Boot the installer using the UEFI boot entry for the USB drive.
  4. At the disk selection screen:
    • If disk shows: delete partitions (or use diskpart clean) and let Setup create partitions.
    • If disk doesn’t show: go to Plan B (driver) or re-check VMD/RAID.
  5. After install, reattach other drives, then confirm boot order and that Windows boots without the USB.

Plan B: Corporate machine where RAID/VMD must stay enabled

  1. Keep firmware settings as required by policy (don’t “fix” by disabling VMD if the org depends on it).
  2. Have the correct x64 F6 storage driver on the installer USB in a known folder (for example \drivers\storage).
  3. Boot Windows Setup, hit Shift+F10, and load the driver with drvload.
  4. Verify with diskpart → list disk that the NVMe appears.
  5. Proceed with install; post-install verify the controller and driver are correct in Device Manager or via pnputil.

Plan C: Disk appears but install fails (partition/metadata cleanup)

  1. Shift+F10 → diskpart → list disk.
  2. select disk 0 (confirm size matches the NVMe you want).
  3. clean (wipes partition table; destructive).
  4. convert gpt.
  5. Return to Setup UI and install to unallocated space.

Plan D: Install completes but system won’t boot

  1. Confirm you booted and installed in UEFI mode.
  2. Disconnect secondary drives (reduce variables).
  3. In WinPE, assign letters to Windows and EFI partitions.
  4. Run bcdboot to regenerate boot files on the EFI partition.
  5. Check BIOS boot order: Windows Boot Manager should be the first entry.

FAQ

1) Why does Linux live media see the NVMe but Windows Setup doesn’t?

Linux often has broader in-kernel support for storage controllers out of the box, including some VMD/RST-adjacent paths. Windows Setup’s WinPE image may lack the vendor driver. This is a driver/controller-mode mismatch, not a “Windows hates NVMe” situation.

2) If I disable RAID/VMD to install, can I re-enable it later?

Not safely without preparation. Switching storage mode after install can cause boot failures because the required driver stack differs. If you must change modes later, plan it: enable the needed drivers in the OS before flipping firmware settings.

3) Is AHCI required for NVMe?

No. NVMe is its own protocol. People say “AHCI” because BIOS vendors use it as shorthand for “not RAID/RST.” What you want is “standard NVMe exposed directly,” not “NVMe hidden behind RAID/VMD.”

4) I see the disk, but Setup says it can’t install to this partition. What now?

Usually boot mode or partition style mismatch (UEFI/GPT vs Legacy/MBR), or stale metadata. Boot the installer in UEFI mode and use diskpart clean + convert gpt for a clean install.

5) Does Secure Boot prevent NVMe detection?

Generally no. Secure Boot affects what bootloaders and drivers can load. It can complicate loading third-party drivers in WinPE, but it doesn’t “hide” the NVMe device by itself.

6) Why does the NVMe show in BIOS but not in diskpart?

BIOS seeing it means the device exists electrically. Diskpart not seeing it implies the OS environment doesn’t have the right driver path to the controller mode currently configured (VMD/RST is a usual suspect).

7) Should I update BIOS/UEFI to fix this?

If the platform is old or known to have NVMe boot quirks, yes—BIOS updates can improve NVMe enumeration and UEFI boot reliability. But don’t use BIOS updates as your first move. Check VMD/RAID mode first; it’s faster and less risky.

8) Why does Windows Setup put boot partitions on the “wrong” drive?

Setup follows disk enumeration order and sometimes chooses the first suitable EFI System Partition it sees. With multiple drives attached, it can place boot files on a different disk than the one you’re installing Windows onto. The fix is simple: disconnect other drives during install.

9) Do I need vendor NVMe drivers for performance?

Most consumer and enterprise NVMe drives perform well on Microsoft’s standard NVMe driver. Vendor drivers can add management features, but they’re not mandatory for baseline performance. For reliability, fewer moving parts is often better.

10) How do I know whether I’m dealing with VMD specifically?

In BIOS, you’ll see “VMD” toggles or “Intel Volume Management Device.” In WinPE/Windows, device listings may show an Intel VMD controller rather than a standard NVMe controller until the correct driver is loaded.

Next steps that prevent a repeat

Once you’ve got Windows installed and booting, do the operationally mature thing: make the next reinstall easier.

  • Record firmware settings (VMD on/off, RAID/AHCI, UEFI/CSM). A photo works. A ticket note works. Amnesia does not work.
  • Keep a known-good driver stash on your installer USB: storage (RST/VMD), network, and chipset. Label folders clearly.
  • Standardize the install mode: UEFI + GPT, consistently. If a machine can’t do that, it’s telling you it’s time to retire it or isolate it.
  • Disconnect non-target drives during installs. This prevents bootloader placement issues and accidental wipes.
  • After install, verify the storage stack: ensure the system uses the expected controller/driver and that the NVMe shows healthy in OS tools.
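The first and last bullets are a single copy-paste once Windows is up. A sketch (the output file name is arbitrary; bcdedit needs an elevated prompt; SCSIAdapter is the setup class Device Manager displays as “Storage controllers”):

```bat
rem Record the storage controller stack with your build notes.
pnputil /enum-devices /class SCSIAdapter > build-notes.txt

rem winload.efi in the output means UEFI boot; winload.exe means Legacy.
bcdedit /enum {current} | find "path" >> build-notes.txt
```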

The headline takeaway is simple: when Windows Setup can’t see your NVMe, the disk usually isn’t the problem. The translation layer is. Fix the controller mode or supply the driver, then move on with your life.
