Oh, no... nothing so momentous!
The CD-ROM as a storage medium already existed at the time. There was a filesystem format (ISO 9660) already in use for data storage on it... a fairly complex standard. I had a SCSI CD-ROM drive on my Mac at home at the time. Knowing about CD-ROM helped me get the job, I'm sure. At the time, CD-ROM hadn't been used for videogame consoles... 3DO was the first.
I didn't want to use the ISO 9660 format for the 3DO... it was designed for data archiving, not for high performance as a "live" filesystem. It was overly complex, and it pretty much requires that you read the entire filesystem directory tree into memory and keep it there at all times... not a great choice for a limited-memory game system built using a small microprocessor core. A single bad scratch in the wrong place could leave most of the files inaccessible and useless.
I designed a different filesystem structure that was better suited to the need. Its big innovation was that it could store two or more copies ("avatars") of a file or directory at different places on the disc, and at runtime the o/s would automatically read the one which was closest to the current location of the read head. It would automatically mark an avatar as damaged if it had trouble reading the data due to a scratch or some dirt on the disc, and would automatically retry the read using a different avatar. This improved both performance and reliability.
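The nearest-avatar selection and the damage-and-retry behavior can be sketched roughly like this. This is a minimal illustration, not the actual 3DO code; the struct fields, function names, and block-address scheme are all hypothetical.

```c
#include <stdlib.h>   /* labs() */

/* One stored copy ("avatar") of a file's data. Layout is
   illustrative, not the real 3DO on-disc format. */
typedef struct {
    long block;     /* starting block address on the disc */
    int  damaged;   /* set after a failed read; skipped thereafter */
} Avatar;

/* Pick the undamaged avatar whose start block is closest to the
   current head position. Returns its index, or -1 if every copy
   has been marked damaged. */
int pick_avatar(const Avatar *av, int count, long head_pos)
{
    int best = -1;
    long best_dist = 0;
    for (int i = 0; i < count; i++) {
        if (av[i].damaged)
            continue;
        long dist = labs(av[i].block - head_pos);
        if (best < 0 || dist < best_dist) {
            best = i;
            best_dist = dist;
        }
    }
    return best;
}

/* On a read error, remember the bad copy and retry with another.
   read_blocks() stands in for the real driver call; it returns 0
   on success. Returns the index of the avatar that worked, or -1
   if all copies failed. */
int read_with_retry(Avatar *av, int count, long head_pos,
                    int (*read_blocks)(long block))
{
    for (;;) {
        int i = pick_avatar(av, count, head_pos);
        if (i < 0)
            return -1;              /* every copy is damaged */
        if (read_blocks(av[i].block) == 0)
            return i;               /* success */
        av[i].damaged = 1;          /* skip this copy from now on */
    }
}
```

The key design point is that the retry loop needs no extra bookkeeping beyond one "damaged" flag per avatar: a scratched copy degrades into a no-op, and reads quietly migrate to the surviving copies.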
A lot of games had only a couple of hundred megabytes of content, which meant that the 3DO disc could store redundant copies of almost all of it, and survive some pretty bad damage... scratches, fingerprints, baby spittle, etc.
From the programmers' point of view it was automatic and invisible... no special effort required. You would build a filesystem image with default options (one avatar per file) on an emulator, and then play the game a few times. The emulator would keep track of how many times each data block was read. An analysis script would then figure out which files were the commonly used ones. You would run the filesystem builder a second time, and it would create a filesystem which used up the entire storage capacity of the disc, scattering multiple avatars of the commonly-used files all over the place... some files might have a dozen avatars, from inner edge to outer. That's the image which was then burned to CD-R and sent off to manufacturing.
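The second builder pass essentially has to answer one question: given the read counts from the emulator log and the spare capacity left on the disc, which files get the extra avatars? A simple greedy scheme captures the idea. Again, this is a hypothetical sketch (the struct fields and scoring rule are my own illustration, not the actual 3DO tool's logic).

```c
/* Per-file stats from the emulator's read-count log. Field names
   are illustrative, not the actual tool's format. */
typedef struct {
    const char *name;
    long size_blocks;   /* blocks one avatar occupies (assumed > 0) */
    long read_count;    /* how often the emulator saw it read */
    int  avatars;       /* copies to place; every file starts at 1 */
} FileStat;

/* Greedily hand out extra avatars until the spare capacity of the
   disc is used up. Scoring by read_count / avatars gives each extra
   copy diminishing returns, so hot files get many avatars without
   starving everything else. */
void assign_avatars(FileStat *f, int n, long spare_blocks)
{
    for (;;) {
        int best = -1;
        double best_score = 0.0;
        for (int i = 0; i < n; i++) {
            if (f[i].size_blocks > spare_blocks)
                continue;           /* another copy wouldn't fit */
            double score = (double)f[i].read_count / f[i].avatars;
            if (best < 0 || score > best_score) {
                best = i;
                best_score = score;
            }
        }
        if (best < 0)
            break;                  /* nothing more fits */
        f[best].avatars++;
        spare_blocks -= f[best].size_blocks;
    }
}
```

A separate placement step would then scatter each file's copies from the inner edge of the disc to the outer, so that one of them is always reasonably close to wherever the head happens to be.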
A guy named Hedley Davis gave me a suggestion which made it even better... constructing a "startup cache" of exactly those files being read when the game first started, so that it could all be streamed into memory with a minimum number of head seeks. His original idea wasn't directly implementable but I figured out a variation which was (and I neglected to see that he got proper public credit for the original idea when I demonstrated the results... a graceless oversight that I still very much regret).
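The startup-cache idea boils down to laying the boot-time files back-to-back in the order they are first read, so the whole set streams in with essentially one seek. A trivial sketch, with hypothetical names and a made-up block numbering:

```c
/* One file observed being read during startup, in first-read order.
   Fields are illustrative only. */
typedef struct {
    int  file_id;
    long size_blocks;
    long start_block;   /* filled in by pack_startup_cache() */
} CacheEntry;

/* Place the startup files contiguously beginning at `base`, in the
   order the emulator first saw them read. Returns the first free
   block after the cache region. */
long pack_startup_cache(CacheEntry *e, int n, long base)
{
    long next = base;
    for (int i = 0; i < n; i++) {
        e[i].start_block = next;
        next += e[i].size_blocks;
    }
    return next;
}
```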
The device driver worked with an industry-standard ATAPI drive... I already understood SCSI, and there's not a lot of difference at the command level.