Windows Installer Can’t Find Drives: When You Need IRST or VMD

Nothing feels more “enterprise” than a brand-new laptop or server that boots a Windows installer flawlessly… and then politely insists you have no storage. You’re staring at a blank disk list, the clock is ticking, and the person who requested this build “needs it for a meeting.”

This is almost never magic. It’s usually IRST (Intel Rapid Storage Technology) and/or Intel VMD (Volume Management Device) hiding your NVMe drives behind a controller Windows Setup can’t talk to without the right driver. Sometimes it’s your BIOS storage mode. Sometimes it’s both. Either way, the fix is straightforward if you diagnose it like an SRE: confirm what layer is missing, prove it with commands, then change one thing at a time.

What’s actually happening when Windows Setup shows no disks

Windows Setup isn’t “missing your SSD.” It’s missing the path to your SSD. Most modern NVMe drives sit behind a controller that may be exposed as:

  • Standard NVMe (Windows has an inbox driver; it usually “just works”).
  • Intel RST / RAID mode (the controller abstracts disks; Windows needs Intel’s driver).
  • Intel VMD (a hardware feature that “owns” NVMe devices, so they enumerate only behind the VMD controller; Windows Setup needs a VMD-capable driver).

If your BIOS is configured for RAID or VMD, your NVMe drives are not exposed as plain PCIe NVMe devices. They’re presented through Intel’s stack. That can be desirable in managed fleets (consistency, certain RAID modes, enterprise policies), but it’s hostile to an unprepared installer USB.

So the installer does what it can: it boots WinPE, loads inbox storage drivers, enumerates storage devices, and then shrugs. The “Load driver” button isn’t a suggestion. It’s the trapdoor out of this situation.

Joke #1: Windows Setup is like a bouncer. If your storage driver isn’t on the list, your SSD isn’t getting into the club.

Where people lose time

They treat it as a generic “Windows bug,” and then randomly toggle BIOS settings until something appears. Sometimes that works. Sometimes it burns your existing data. Sometimes it breaks BitLocker recovery requirements later. The professional move is to identify whether you’re dealing with IRST, VMD, or something else and then proceed with intention.

IRST vs VMD: how they differ and why Windows cares

Intel RST (Rapid Storage Technology) in one practical paragraph

IRST is Intel’s storage driver stack for systems configured in RAID mode (even if you’re not using RAID). When “SATA mode” or “Storage mode” is set to RAID, the firmware exposes a controller that expects an Intel driver (often branded as RST). With NVMe drives, IRST can still be involved, especially when the platform wants to manage NVMe behind a RAID layer.

Intel VMD (Volume Management Device) in one practical paragraph

VMD is a feature that sits between the CPU/chipset and NVMe devices, presenting them as managed endpoints behind a VMD controller. This is common on newer Intel platforms, especially corporate laptops and servers. When VMD is enabled, your NVMe SSDs do not enumerate as standard NVMe devices to the OS installer. They show up only if the OS has the correct VMD-aware storage driver.

How to tell which one you’re facing

  • If BIOS mentions VMD explicitly (or “VMD controller”), assume you need the VMD-capable IRST driver.
  • If BIOS says RAID mode and no disks show, assume IRST driver is needed.
  • If BIOS is AHCI and VMD is disabled, Windows almost always sees NVMe without extra drivers (unless it’s a special controller or a broken image).
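From a Linux live environment, the same decision tree can be scripted against `lspci` output. A sketch, not a definitive classifier — the function name and match strings are mine, based on how these controllers typically describe themselves:

```shell
#!/usr/bin/env bash
# Classify what the Windows installer will face, from a captured `lspci -nn` line.
# Heuristic keywords only; exact device strings vary by platform generation.
classify_storage() {
  local pci="$1"
  if grep -qi "Volume Management Device" <<<"$pci"; then
    echo vmd      # plan on the VMD-capable IRST driver (e.g. iaStorVD.inf)
  elif grep -qi "RAID bus controller" <<<"$pci"; then
    echo raid     # plan on an IRST/RAID driver
  elif grep -qi "Non-Volatile memory controller" <<<"$pci"; then
    echo nvme     # the inbox NVMe driver should see the disk
  else
    echo unknown
  fi
}

# Example against a sample VMD line (sample text, not live hardware):
classify_storage "00:0e.0 RAID bus controller: Intel Volume Management Device NVMe RAID Controller"
# prints: vmd
```

Note that the VMD check runs first: a VMD controller also advertises itself as a RAID bus controller, so the order of the branches matters.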

Two ways out

  1. Load the correct storage driver during installation (best when you must keep RAID/VMD enabled).
  2. Change BIOS storage mode to AHCI / disable VMD (best for simple, single-disk installs—unless the machine is managed and expects RAID/VMD).

In corporate environments, option 2 can be a career-limiting move if it violates standards, breaks remote management expectations, or invalidates your golden image assumptions. Use it deliberately, not as a panic button.

Fast diagnosis playbook (first/second/third)

This is the “stop flailing” sequence. Do it in order. You’re trying to find the bottleneck quickly: firmware mode, missing driver, or media problem.

First: confirm BIOS storage mode and VMD state

  • Check if VMD is enabled for NVMe.
  • Check if storage controller is RAID/RST vs AHCI.
  • Check Secure Boot state only if you plan to load unsigned drivers (usually you shouldn’t).

Decision: If VMD/RAID is enabled, plan to load IRST/VMD driver. Don’t waste time repartitioning; the installer can’t see the device yet.

Second: from WinPE, prove whether any disk is visible at all

  • Use diskpart to see whether Windows sees any physical disks.
  • If zero disks appear, this is almost always a missing storage driver or a platform controller mode mismatch.

Decision: If diskpart sees nothing, don’t blame partition tables. You’re still upstream of that.
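If you would rather script this check than type into an interactive session, diskpart accepts a script file. A minimal sketch; the filename chk.txt and the drive letter D: are assumptions for illustration:

```
rem Save as D:\chk.txt, then run from the Setup command prompt:
rem   diskpart /s D:\chk.txt
list disk
list volume
exit
```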

Third: load the driver (or switch BIOS mode) and re-check enumeration

  • Load the IRST/VMD “F6” driver from a USB stick.
  • Re-run disk enumeration. Confirm you now see disks in diskpart and in the GUI.

Decision: If the driver loads but disks still don’t appear, you likely have the wrong driver branch (VMD vs non-VMD), wrong architecture, or a USB/media issue.

Facts and history that explain today’s pain

These aren’t trivia for trivia’s sake. They explain why “Windows can’t find drives” still happens in 2026 on premium hardware.

  1. The “F6 driver” ritual is older than NVMe. Windows XP-era installs used F6 to load third-party storage drivers for RAID/SCSI controllers. The muscle memory stuck; the branding did not.
  2. AHCI was a compatibility revolution. Once AHCI became common, Windows gained broad inbox support for SATA controllers, reducing the need for vendor drivers on consumer machines.
  3. NVMe simplified hardware, then VMD complicated visibility again. Standard NVMe is clean, but enterprise manageability features often reintroduce an abstraction layer.
  4. “RAID mode” can be enabled even when you’re not using RAID. OEMs ship systems with RAID/RST enabled for fleet uniformity, not because every laptop is doing RAID0.
  5. VMD is partly about management and hot-plug behavior. On some platforms it supports enterprise behaviors and standardized enumeration paths that OEMs like.
  6. Windows inbox drivers lag behind platform features. Microsoft can’t ship every vendor’s latest controller nuance in-box without testing and support complexity exploding.
  7. UEFI made installs easier; storage modes kept them interesting. UEFI removed a lot of bootloader pain, but platform storage controller modes still gate device discovery.
  8. OEM driver packaging is optimized for running Windows, not installing it. Vendors often ship EXEs meant for a live OS; the installer needs extracted INF/CAT/SYS files.
  9. Corporate imaging practices amplify small mismatches. A single BIOS baseline change (like enabling VMD) can break every prebuilt installer USB in the building.

Practical tasks: commands, outputs, and decisions (12+)

Below are real tasks you can run while troubleshooting. Some are run inside Windows Setup (WinPE), some on an admin workstation, and some on a Linux live environment when Windows is being uncooperative. Each includes: command, what the output means, and the decision to make.

Task 1: In WinPE, verify you’re actually in WinPE and which build you’re running

cr0x@server:~$ wpeutil UpdateBootInfo
Updating boot information...
Success.

What it means: You’re in Windows Preinstallation Environment (WinPE) if wpeutil exists and works.

Decision: Continue with WinPE-native tools. If wpeutil is missing, you’re not in Setup/WinPE, and these steps won’t match your environment.

Task 2: In WinPE, list physical disks (the fastest “do we see anything?” check)

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list disk

  Disk ###  Status         Size     Free     Dyn  Gpt
  --------  -------------  -------  -------  ---  ---

What it means: No disks listed. WinPE currently has zero visibility into your storage controller.

Decision: Stop blaming partition tables. Load IRST/VMD driver or change BIOS mode.

Task 3: In WinPE, check if the USB installer is mapped and what drive letters exist

cr0x@server:~$ wmic logicaldisk get deviceid, volumename, filesystem, size
DeviceID  FileSystem  Size        VolumeName
X:        NTFS        6442450944  BOOT
D:        FAT32       3126853632  ESD-USB

What it means: X: is the WinPE RAM-disk volume; D: is likely your installer USB.

Decision: You have a place to stage drivers (another USB can show as E:, etc.). If you don’t see your driver media, fix USB port/media first.

Task 4: In WinPE, inspect loaded storage-related drivers

cr0x@server:~$ drvload /?
Loads a driver offline.

DRVLOAD filename.inf

What it means: drvload is available, which is how you load the extracted IRST/VMD INF in Setup.

Decision: Prepare the driver folder with INF/SYS/CAT, not an EXE.

Task 5: In WinPE, load the IRST/VMD driver from a second USB

cr0x@server:~$ drvload E:\IRST\VMD\f6vmdflpy-x64\iaStorVD.inf
DrvLoad: Successfully loaded E:\IRST\VMD\f6vmdflpy-x64\iaStorVD.inf.

What it means: The driver INF loaded and was accepted by WinPE.

Decision: Immediately re-run diskpart. If disks appear now, proceed with install. If not, you likely loaded the wrong branch (non-VMD vs VMD) or wrong architecture.

Task 6: In WinPE, re-check disk enumeration after driver load

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> list disk

  Disk ###  Status         Size     Free     Dyn  Gpt
  --------  -------------  -------  -------  ---  ---
  Disk 0    Online          953 GB   953 GB        *

What it means: The SSD is visible now. The problem was driver visibility, not hardware failure.

Decision: Install normally. If this machine previously had data, pause and confirm you’re not about to wipe the wrong drive.

Task 7: In WinPE, identify the disk and confirm it’s the expected model

cr0x@server:~$ wmic diskdrive get model, size, serialnumber
Model                          SerialNumber        Size
SAMSUNG MZVL21T0HCLR-00B00     S6XYZ0ABC123456     1024209543168

What it means: You can see the model/size. This helps avoid installing to the wrong device (especially on systems with multiple drives).

Decision: If the model doesn’t match expectations, stop and verify hardware inventory or BIOS drive mapping.

Task 8: In WinPE, check whether the disk is GPT and whether old partitions exist

cr0x@server:~$ diskpart
Microsoft DiskPart version 10.0.22621.1

DISKPART> select disk 0

Disk 0 is now the selected disk.

DISKPART> list part

  Partition ###  Type              Size     Offset
  -------------  ----------------  -------  -------
  Partition 1    Recovery           980 MB  1024 KB
  Partition 2    System             100 MB   981 MB
  Partition 3    Reserved            16 MB  1081 MB
  Partition 4    Primary            952 GB  1097 MB

What it means: The disk has existing partitions; it’s not “blank.” That’s normal for reimages.

Decision: Decide whether you’re doing an in-place install, a clean install, or preserving a recovery partition. Don’t mindlessly clean unless your process requires it.

Task 9: In WinPE, confirm you can see the driver store and log files

cr0x@server:~$ dir X:\Windows\inf
 Volume in drive X has no label.
 Directory of X:\Windows\inf

10/10/2025  11:03 AM    <DIR>          .
10/10/2025  11:03 AM    <DIR>          ..
09/15/2025  08:21 PM           152,340 setupapi.dev.log

What it means: The SetupAPI log exists; it records driver load attempts and failures.

Decision: If driver loading fails silently in the GUI, you can read logs here to confirm signature/INF parsing issues.

Task 10: In WinPE, inspect SetupAPI log for storage driver binding clues

cr0x@server:~$ type X:\Windows\inf\setupapi.dev.log | findstr /i "iastor vmd fail error"
>>>  [Device Install (Hardware initiated) - PCI\VEN_8086&DEV_9A0B]
>>>  Section start 2026/02/05 12:41:09.123
!    dvi: Failed to install device. Error = 0xE000022F

What it means: The platform has an Intel device (VEN_8086), and driver installation failed with 0xE000022F, which indicates the INF lacks usable digital signature/catalog information — typical of a package that wasn’t meant for OS installation.

Decision: You probably have the wrong driver package version or wrong INF. Switch to the OEM-provided “F6” folder intended for OS installation, not the full RST app installer.

Task 11: On a Linux live USB, confirm whether VMD is active and whether NVMe is hidden behind it

cr0x@server:~$ lspci -nn | egrep -i "vmd|raid|nvme"
00:0e.0 RAID bus controller [0104]: Intel Corporation Volume Management Device NVMe RAID Controller [8086:9a0b]

What it means: VMD is enabled and presenting itself as a RAID bus controller. Your NVMe may not show as plain nvme0n1.

Decision: For Windows install, plan on VMD-capable IRST driver. Alternatively disable VMD in BIOS (with eyes open about consequences).
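On the Linux side, invisibility can also mean the live kernel simply lacks (or hasn’t loaded) the vmd module. A small sketch to separate the cases; the function name is mine:

```shell
#!/usr/bin/env bash
# Report whether this Linux environment can talk to a VMD controller at all.
vmd_kernel_status() {
  if lsmod 2>/dev/null | grep -q '^vmd'; then
    echo loaded      # kernel driver active; NVMe behind VMD should enumerate
  elif modinfo vmd >/dev/null 2>&1; then
    echo available   # try: sudo modprobe vmd
  else
    echo missing     # this live image cannot see VMD-managed NVMe
  fi
}

vmd_kernel_status
```

If the answer is “missing,” an empty lsblk proves nothing about the hardware — it only proves the live image can’t see behind VMD.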

Task 12: On Linux, see if NVMe disks are visible as standard NVMe or behind device-mapper/RAID

cr0x@server:~$ lsblk -o NAME,MODEL,SIZE,TYPE
NAME        MODEL                 SIZE TYPE
sda         USB  SanDisk 3.2Gen1  29G  disk
├─sda1                             4G  part
└─sda2                            25G  part

What it means: Only the USB stick is visible; the internal NVMe is not exposed to Linux either (common when VMD is enabled without kernel support or modules).

Decision: This strongly suggests the controller mode is the gating factor, not Windows-specific weirdness. Go to BIOS settings and confirm VMD/RAID configuration.

Task 13: On a Windows admin machine, verify the driver package actually contains the right INF files

cr0x@server:~$ dir /b IRST\f6vmdflpy-x64
iaStorVD.cat
iaStorVD.inf
iaStorVD.sys
license.txt

What it means: This is the “good” shape for Setup: INF/SYS/CAT in a folder. Not a single EXE.

Decision: Copy this folder to a FAT32/NTFS USB and load iaStorVD.inf in Setup. If your download is an EXE-only bundle, extract it properly or get the correct deployment package.
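The “good shape” test is easy to automate before you walk to the machine. A sketch — check_f6_folder is my name for the helper, and it only checks file presence, not whether the driver matches your platform:

```shell
#!/usr/bin/env bash
# Verify a driver folder contains the INF/SYS/CAT trio that WinPE's
# "Load driver" / drvload path expects (an EXE alone will not work).
check_f6_folder() {
  local d="$1" missing=0
  for ext in inf sys cat; do
    if ! compgen -G "$d/*.$ext" >/dev/null; then
      echo "missing *.$ext in $d"
      missing=1
    fi
  done
  if [ "$missing" -eq 0 ]; then
    echo "folder looks Setup-loadable"
  fi
  return "$missing"
}

# Usage: check_f6_folder IRST/f6vmdflpy-x64
```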

Task 14: Create a working Windows installer USB (because broken media causes fake storage problems)

cr0x@server:~$ sudo woeusb --device Win11_23H2_English_x64.iso /dev/sdb
WoeUSB-ng : Creating Windows USB stick on /dev/sdb...
Info: Mounting source filesystem...
Info: Copying files...
Info: Installing GRUB...
Done

What it means: Your USB was written successfully by a tool designed for Windows ISOs.

Decision: If the machine still can’t see disks after proper driver load, it’s not because your ISO wasn’t copied correctly.
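One cheap way to rule out “broken media” even earlier is to verify the ISO’s checksum before writing it. A sketch; verify_iso is my helper, and the expected hash comes from wherever you downloaded the ISO:

```shell
#!/usr/bin/env bash
# Compare an ISO's SHA-256 against the published value before writing USB media.
verify_iso() {
  local iso="$1" expected="$2" actual
  actual=$(sha256sum "$iso" | awk '{print $1}')
  if [ "$actual" = "$expected" ]; then
    echo "checksum OK"
  else
    echo "checksum MISMATCH: got $actual"
    return 1
  fi
}

# Usage: verify_iso Win11_23H2_English_x64.iso <published-sha256>
```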

Task 15: Optional but powerful: inject the IRST/VMD driver into boot.wim so you never “Load driver” again

This is a DISM job on a Windows technician machine (wimlib on Linux can mount WIMs, but it can’t register a Windows driver package):

cr0x@server:~$ dism /Mount-Image /ImageFile:boot.wim /Index:2 /MountDir:C:\mnt\wim
Deployment Image Servicing and Management tool
Mounting image
The operation completed successfully.
cr0x@server:~$ dism /Image:C:\mnt\wim /Add-Driver /Driver:IRST\f6vmdflpy-x64\iaStorVD.inf
Found 1 driver package(s) to install.
Installing 1 of 1 - IRST\f6vmdflpy-x64\iaStorVD.inf: The driver package was successfully installed.
The operation completed successfully.
cr0x@server:~$ dism /Unmount-Image /MountDir:C:\mnt\wim /Commit
Saving image
Unmounting image
The operation completed successfully.

What it means: You baked the driver into the WinPE image (index 2 is usually Windows Setup). Now the installer should see VMD-managed disks without manual driver loading.

Decision: Use this for repeat installs at scale. Don’t do it for one-off fixes unless you like carrying artisanal USB sticks for the rest of your life.

Three corporate mini-stories from the trenches

Mini-story 1: The incident caused by a wrong assumption

A mid-sized company rolled out a new batch of developer laptops. Same OEM as last year. Same Windows version. Same imaging USB, lovingly labeled with a fading Sharpie. The techs expected a boring day of “next-next-finish.”

Instead, every single machine hit the same wall: Windows Setup claimed there were no drives. The team assumed the SSDs were dead. A few laptops got RMA paperwork started. Someone even suggested a “bad shipment of NVMe.” That’s when the first real diagnosis happened: one tech booted a Linux live USB and saw an Intel VMD controller in lspci.

The OEM had changed the BIOS baseline. VMD was now enabled by default. The imaging USB worked fine last year because NVMe devices were exposed as standard NVMe. This year, the OS needed the VMD-capable IRST driver during installation. Nothing was “broken”; the environment changed.

They fixed it by extracting the correct F6 VMD driver package and adding it to the installer workflow. The RMA requests were canceled. The wrong assumption wasn’t “these drives are dead,” it was “this year’s platform behaves like last year’s.” In infrastructure, “same vendor” is not the same thing as “same stack.”

Mini-story 2: The optimization that backfired

A different org tried to optimize provisioning time. Their thinking: “If RAID/VMD causes problems, we’ll standardize on AHCI and disable VMD everywhere. One less driver, fewer moving parts.” On paper, it looked clean.

In practice, they had a mixed fleet. Some models had BitLocker enabled during first boot via policy. Others had vendor tooling that expected RAID mode, and a handful relied on preboot diagnostics that assumed the OEM storage configuration. Disabling VMD made Windows Setup happy, but it created a different kind of sadness: post-install compliance checks started failing, and a few devices needed manual recovery steps when BitLocker state didn’t match the expected measurements.

The kicker: a subset of users later got BIOS updates that re-enabled VMD as part of “recommended defaults.” Suddenly, devices that were built in AHCI mode and then flipped back to VMD mode after an update stopped booting cleanly. It wasn’t a data loss apocalypse, but it was an operations tax that never ended.

They eventually rolled back to the boring approach: keep the OEM storage mode defaults, and make the installer adapt by including the right drivers. The “optimization” reduced complexity for the installer, but increased complexity for the lifecycle. Classic local minimum.

Mini-story 3: The boring but correct practice that saved the day

A large enterprise had a simple rule: every new hardware model got a one-page “boot and storage” validation before it entered mass deployment. Not a weeks-long certification program—just a repeatable checklist. BIOS storage mode documented. VMD state documented. A WinPE dry run with diskpart. Driver package archived in a standard location.

It was unglamorous. It also prevented chaos when a new ultrabook model arrived with VMD enabled and a slightly different device ID than previous generations. The desk-side team didn’t need to guess. Their internal note said: “VMD enabled, use iaStorVD.inf from the approved bundle. Confirm Disk 0 appears in diskpart before proceeding.”

When Windows Setup showed no drives (as it predictably did), the techs followed the script. Driver loaded. Disk appeared. Install completed. No escalation. No midnight chat thread. No improvised BIOS roulette.

This is the SRE lesson in a different outfit: most outages are avoided by boring preparation, not heroic troubleshooting.

Common mistakes: symptom → root cause → fix

1) Symptom: “No drives found” but the SSD is brand new

Root cause: VMD enabled; Windows Setup lacks the VMD-capable IRST driver.

Fix: Load the correct F6 VMD driver (often iaStorVD.inf) during Setup, or disable VMD in BIOS if allowed.

2) Symptom: Driver loads successfully, but disks still don’t appear

Root cause: You loaded the wrong driver branch (non-VMD IRST), wrong architecture, or wrong generation package that doesn’t match your device ID.

Fix: Use SetupAPI logs to see the PCI device ID and match it to the INF. Get the OEM-approved driver bundle for that platform. Try the VMD-specific INF.

3) Symptom: Disks appear only when you switch BIOS to AHCI

Root cause: RAID/VMD was enabled and required vendor drivers; AHCI exposes a standard path Windows supports in-box.

Fix: Prefer loading drivers and keeping RAID/VMD if corporate baseline requires it. If you choose AHCI, commit to it and understand future BIOS updates may revert settings.

4) Symptom: After changing BIOS from RAID/VMD to AHCI, existing Windows won’t boot

Root cause: The installed OS was configured with a different storage driver stack; switching modes breaks boot storage enumeration.

Fix: If you must switch, prepare Windows by enabling the appropriate driver before flipping BIOS (or reinstall). In managed environments, avoid toggling modes on already-installed systems unless you have a tested procedure.

5) Symptom: “A media driver your computer needs is missing” during Setup

Root cause: Often USB controller/port issues (especially USB 3.x quirks), but it gets misdiagnosed as storage. Sometimes it is storage, but not always.

Fix: Try a different USB port (prefer USB-A 2.0 if available), a different USB stick, and rebuild the installer media. Then revisit IRST/VMD.

6) Symptom: Disk shows up, but install fails or is painfully slow

Root cause: Wrong or generic driver is functioning but not optimal, or the platform is in a degraded RAID mode.

Fix: Post-install, verify the correct storage driver is installed and the controller is healthy. If RAID is configured, check array state before blaming Windows.

7) Symptom: You can see the disk in diskpart, but Windows installer GUI still shows no install targets

Root cause: The disk is offline, read-only, or has unusual partition metadata confusing Setup.

Fix: In diskpart: attributes disk, online disk, and if policy allows, wipe and convert to GPT cleanly.
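If policy allows a destructive reset, that fix sequence can be expressed as one diskpart script. This wipes the selected disk — a sketch, with disk 0 assumed; verify the disk number first:

```
rem DESTRUCTIVE: wipes disk 0. Run with: diskpart /s fix.txt
select disk 0
attributes disk clear readonly
online disk noerr
clean
convert gpt
exit
```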

8) Symptom: Everything worked last quarter; now it fails on the same model

Root cause: BIOS update changed defaults (VMD enabled, RAID mode toggled), or your installer ISO changed build and removed a driver you relied on.

Fix: Treat BIOS baselines like code. Re-validate after firmware updates. Maintain a known-good driver injection process.

Joke #2: A storage mode toggle is the “have you tried turning it off and on again” of enterprise imaging—effective, but you’ll regret doing it casually.

Checklists / step-by-step plan

Checklist A: One-off install (you just need this machine built today)

  1. Enter BIOS/UEFI settings.
  2. Write down current settings: RAID vs AHCI, VMD enabled/disabled.
  3. Boot Windows installer USB.
  4. At the disk selection screen, press Shift+F10 to open a command prompt.
  5. Run diskpart, then list disk.
    • If no disks: you need the driver or BIOS mode change.
    • If disks exist: skip to partition decisions.
  6. Insert USB with extracted IRST/VMD F6 drivers (INF/SYS/CAT).
  7. Run drvload against the correct INF (often iaStorVD.inf for VMD).
  8. Re-run diskpart, then list disk, to confirm enumeration.
  9. Back in the GUI, click Refresh. Select the correct disk.
  10. Proceed with install. After first boot, confirm Device Manager shows the expected storage controller driver.

Checklist B: Fleet provisioning (you want this to stop happening)

  1. Pick a BIOS baseline per model and lock it via management tooling where possible.
  2. Collect the exact driver package needed for that baseline (VMD vs non-VMD, Windows version compatibility).
  3. Integrate driver injection into your imaging pipeline:
    • Inject into boot.wim so Setup sees disks.
    • Inject into install.wim so the installed OS has the driver on first boot.
  4. Maintain a minimal “break glass” USB containing:
    • F6 driver folders for common models
    • One text file documenting which INF to load for which platform
  5. After any BIOS update campaign, re-run a validation install in a lab.
  6. Write down acceptance tests: WinPE sees disks, installer completes, BitLocker policy compliance passes, reboot cycle passes.
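Step 3’s injection can be sketched with DISM on a Windows technician machine. Paths and index numbers here are assumptions — index 2 of boot.wim is usually Windows Setup, and install.wim indexes vary by edition, so check with dism /Get-WimInfo first:

```
rem Sketch: inject the F6 driver into both images (paths are placeholders)
dism /Mount-Image /ImageFile:boot.wim /Index:2 /MountDir:C:\mnt\wim
dism /Image:C:\mnt\wim /Add-Driver /Driver:C:\drivers\f6vmdflpy-x64\iaStorVD.inf
dism /Unmount-Image /MountDir:C:\mnt\wim /Commit

dism /Mount-Image /ImageFile:install.wim /Index:1 /MountDir:C:\mnt\wim
dism /Image:C:\mnt\wim /Add-Driver /Driver:C:\drivers\f6vmdflpy-x64\iaStorVD.inf
dism /Unmount-Image /MountDir:C:\mnt\wim /Commit
```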

Checklist C: If switching BIOS from RAID/VMD to AHCI is being proposed

  1. Determine whether this is a new install or an existing OS.
  2. If existing OS: do not switch modes casually. Plan a tested migration path or reinstall.
  3. Confirm corporate requirements: RAID expectations, compliance tooling, BitLocker/TPM measurement expectations.
  4. Confirm BIOS update behavior: will firmware updates revert storage defaults?
  5. Make the decision:
    • If you need corporate consistency: keep RAID/VMD, fix the installer with drivers.
    • If you control the full lifecycle and want simplicity: AHCI can be fine, but document and enforce it.

Paraphrased idea (Gene Kranz): “Be tough and competent.” In ops, that means calm diagnostics, not random toggling.

FAQ

1) Why does Windows Setup show “No drives found” on a brand-new NVMe laptop?

Because the NVMe is not exposed as a standard NVMe device. With VMD or RAID/RST mode enabled, the installer needs Intel’s storage driver to see the disk.

2) Should I disable VMD to install Windows?

If it’s your personal machine and you want the simplest path, disabling VMD (and using AHCI) can work well. In corporate fleets, don’t do this unless it’s an approved baseline; it can cause post-install tooling, compliance, or firmware-update surprises.

3) What is an “F6 driver” and why do people still call it that?

It’s the old Windows install method for loading third-party storage drivers, historically triggered by pressing F6 in older Windows versions. Today it’s usually a “Load driver” button and an INF file, but the name stuck.

4) I downloaded Intel RST, but it’s an EXE. Windows Setup can’t use it. Now what?

You need the extracted driver files (INF/SYS/CAT), often packaged as an “F6” driver bundle. Get the version intended for OS installation or extract the bundle properly on another machine.

5) How do I know if I need the VMD-specific driver versus regular IRST?

Look for VMD in BIOS, or identify the controller in logs/device IDs. In practice: if VMD is enabled, use the VMD-capable INF (commonly iaStorVD.inf). If VMD is disabled but RAID/RST is enabled, you may use a different IRST INF (often iaStorAC.inf in some packages).

6) Can I install Windows without loading drivers if I switch to AHCI?

Usually yes, because Windows has inbox AHCI and standard NVMe drivers. But if you later re-enable VMD/RAID, the installed OS may not boot without the proper drivers and configuration.

7) Why does diskpart show the disk, but the Windows installer UI still doesn’t?

Common causes: the disk is offline, read-only, or has partition metadata that Windows Setup refuses (rare but real). Use diskpart to check disk attributes and online state; if policy allows, wipe and convert to GPT.

8) Is this a Windows 11 problem specifically?

No. It’s a storage controller visibility problem. Windows 10 and 11 can both hit it depending on platform defaults and which drivers are present in the installer image.

9) Do AMD systems have the same issue?

They can have “missing storage driver” issues, but IRST/VMD is Intel-specific. On AMD, you’re more likely dealing with RAID mode drivers or vendor-specific controller drivers rather than VMD.

10) What’s the most reliable long-term fix for an enterprise?

Stop relying on manual “Load driver.” Inject the correct storage drivers into your WinPE (boot.wim) and OS image (install.wim), and enforce a BIOS baseline per model.

Conclusion: next steps that actually work

When Windows Setup can’t find drives, assume a missing storage driver or controller mode mismatch until proven otherwise. Don’t waste an hour repartitioning a disk the installer cannot even see.

Do these next:

  1. Check BIOS: is VMD enabled? is storage set to RAID/RST or AHCI?
  2. In WinPE, run diskpart, then list disk. No disks means you’re missing the path, not the disk.
  3. Load the correct IRST/VMD F6 driver (INF/SYS/CAT). Re-check enumeration.
  4. If this is repeatable across machines, stop doing it by hand: inject the driver into boot.wim and standardize the BIOS baseline.

You’ll end up with fewer “mystery SSD failures,” fewer reinstall loops, and a lot less BIOS cargo culting. Your future self will be bored, and that’s the goal.
