Windows
Minimum
- Version: Windows 10/11; older Windows versions may work but are no longer tested.
- CPU: Intel/AMD, 64 bits
- GPU: Basic Intel graphics cards or Nvidia (e.g. recent GeForce/Quadro/NVS series) graphics cards; AMD graphics cards may work.
- Memory: DDR4 memory, 16 GB of RAM. OpendTect itself needs at least 2 GB of RAM, but real-world projects need far more; treat 16 GB as the absolute minimum.
- Storage: Hard Disk
Recommended
- Version: 10/11
- CPU: Intel/AMD processor with 64 bit support, 3+ GHz multi-core.
Note that OpendTect uses all processors when needed: the more cores and the higher the clock speed, the better. OpendTect automatically uses multiple threads in many situations, depending on the type of attribute, display, etc. We have put a lot of effort into making time-consuming tasks multi-threaded.
- GPU: Nvidia (e.g. recent main-stream up to high-end GeForce series) graphics cards.
Quadro or NVS series cards may give that extra bit of performance. When in doubt, buy the best GeForce card you can find. When buying a laptop, make sure that it has an Nvidia chipset.
- Memory: DDR4 or DDR5 memory; to be on the safe side, don't go for less than 32 GB.
Buy as much memory as you can afford and as fits in the system. Large clients, for example, use nothing less than 512 GB.
- Storage: SSD is best; other options are Hard Disk and Network Drive.
This is usually under-valued, but it is often the crucial performance component. SSD disks give a tremendous performance boost; essentially, data on SSD loads almost as fast as pre-loaded, in-memory data. Performance can be poor if data needs to stream through (relatively) slow disks and/or networks.
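To compare the storage options above on an actual system, sequential-read throughput can be estimated with a small script. The sketch below is a hypothetical helper, not part of OpendTect; note that the operating system's page cache will make the re-read of a freshly written file look faster than a cold read from disk.

```python
# Rough sequential-read benchmark to compare storage options.
# Caveat: the OS page cache may serve the re-read from memory,
# so this gives an upper bound rather than a cold-read figure.
import os
import tempfile
import time

def sequential_read_mb_per_s(path, size_mb=64, chunk=1024 * 1024):
    """Write size_mb of data to `path`, then time a full sequential read."""
    data = os.urandom(chunk)
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())   # make sure the data actually hit the device
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk):
            pass
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

with tempfile.TemporaryDirectory() as d:
    rate = sequential_read_mb_per_s(os.path.join(d, "scratch.bin"))
    print(f"Sequential read: {rate:.0f} MB/s")
```

Running this on the project data drive (rather than the system drive) gives a quick indication of whether storage is the bottleneck.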
For Machine Learning
- Version: 10/11
- CPU: Intel, 64-bit, when using the Python environment with the Intel™ Math Kernel Library (MKL) for Machine Learning on CPU only. AMD, 64-bit, should be fine when using the Python environment with CUDA 10.1 for Machine Learning on the GPU.
Ideally you want the system to be expandable to 4 GPUs. The CPU must be able to support all GPUs. It is important to check how many PCIe lanes the CPU supports and how many are needed for the system's number of GPUs and M.2 NVMe SSDs. We recommend getting a CPU with at least 8 cores, 16 threads and 40 PCIe lanes.
- GPU: Nvidia, GeForce or Quadro series.
The GPU needs to be fast enough and able to fit the model and data batch in memory. When in doubt, choose the card with more memory. Other things to look for are the number of CUDA cores, the number of tensor cores and the memory bandwidth (GB/s). We recommend the following cards:
- Turing architecture cards (CUDA 10 and later)
- Nvidia GeForce RTX 2080 Ti with 11 GB GDDR6 memory and 4352 CUDA Cores
- Nvidia Quadro RTX 6000 with 24 GB GDDR6 memory and 4608 CUDA Cores
- Nvidia Quadro RTX 8000 with 48 GB GDDR6 memory and 4608 CUDA Cores
- Ampere architecture cards (CUDA 11.1 and later)
- Nvidia GeForce RTX 3080 Ti with 12 GB GDDR6 memory and 10240 CUDA Cores
- Nvidia GeForce RTX 3090 with 24 GB GDDR6 memory and 10496 CUDA Cores
- Nvidia A40 with 48 GB GDDR6 memory and 10752 CUDA Cores
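Whether the model and a data batch fit in a given card's memory can be checked with back-of-the-envelope arithmetic before buying. The sketch below is a rough estimate only; the parameter count, batch shape and the 3x overhead factor for activations and gradients are illustrative assumptions, not OpendTect figures.

```python
# Back-of-the-envelope check of whether a model plus one data batch
# fits in GPU memory. All numbers are illustrative assumptions.

def fits_in_gpu(n_params, batch_shape, gpu_gb, bytes_per_value=4, overhead=3.0):
    """Estimate memory need: float32 weights plus the batch with an
    assumed overhead factor for activations and gradients."""
    weights = n_params * bytes_per_value
    batch_values = 1
    for dim in batch_shape:
        batch_values *= dim
    batch = batch_values * bytes_per_value * overhead
    needed_gb = (weights + batch) / 1024**3
    return needed_gb, needed_gb <= gpu_gb

# Example: a hypothetical 30M-parameter 3D model on 64-cube seismic
# patches, batch of 16, against an 11 GB card:
needed, ok = fits_in_gpu(30_000_000, (16, 1, 64, 64, 64), gpu_gb=11)
print(f"~{needed:.1f} GB needed; fits on an 11 GB card: {ok}")
```

If the estimate is close to the card's capacity, the larger-memory card in the same series is the safer choice, in line with the advice above.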
- Memory: DDR4 or DDR5 memory, on the safe side don't go for less than 32 GB.
Buy as much memory as you can afford and as fits in the system.
- Storage: The best choice is an M.2 NVMe SSD that is big enough for the data.
The advantage of an M.2 NVMe SSD is that it plugs directly into the motherboard and is very fast. Other options are SATA SSD, Hard Disk and Network Drive. Performance can be poor if data needs to stream through (relatively) slow disks and/or networks.
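The PCIe-lane budgeting mentioned in the CPU recommendation above can be sketched with simple arithmetic. The x16-per-GPU and x4-per-NVMe figures below are typical values and an assumption; check the actual hardware specifications.

```python
# Rough PCIe lane budget: GPUs typically use an x16 link and M.2
# NVMe drives an x4 link. Typical values assumed, not guaranteed.
def pcie_lanes_needed(n_gpus, n_nvme, lanes_per_gpu=16, lanes_per_nvme=4):
    return n_gpus * lanes_per_gpu + n_nvme * lanes_per_nvme

# The recommended 40-lane CPU with 2 GPUs and 2 NVMe drives:
needed = pcie_lanes_needed(n_gpus=2, n_nvme=2)
print(f"{needed} lanes needed")
```

With 4 GPUs the budget exceeds 40 lanes, which is why expanding to 4 GPUs calls for a CPU (or platform) with more lanes, or accepting that some GPUs run at reduced link width.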
Please note that:
- For best performance OpenGL drivers should be up-to-date. For Machine Learning on GPU we provide a Python package with CUDA 11.3. Please see this table on the Nvidia CUDA Toolkit documentation page for the minimum compatible driver version.
- The CUDA 10 Python environment is now obsolete and no longer receives security updates. Users are encouraged to replace it with CUDA 11. Alternatively, you may opt for the CPU-only Python environment.
- 4K/8K screens are not fully supported yet; this depends on the scaling factor. We are working on a fix. Please see the FAQ Visualization for a possible workaround.
- Windows needs to be updated with the latest updates from Microsoft.
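The minimum-driver check in the first note above can be automated: compare the installed driver version (as reported, for example, by `nvidia-smi`) against the minimum listed in Nvidia's CUDA compatibility table. A minimal sketch, where the version strings used are illustrative examples rather than the actual table values:

```python
# Compare an Nvidia driver version string against a required minimum.
# The minimum value is illustrative; look up the real one in Nvidia's
# CUDA Toolkit compatibility table for your CUDA version and OS.
def parse_version(v):
    return tuple(int(part) for part in v.split("."))

def driver_ok(installed, minimum):
    """True if the installed driver meets or exceeds the minimum."""
    return parse_version(installed) >= parse_version(minimum)

print(driver_ok("472.12", "465.89"))  # newer than the example minimum
print(driver_ok("456.71", "465.89"))  # older than the example minimum
```

Numeric tuple comparison avoids the pitfalls of comparing version strings lexicographically (e.g. "9" vs "89").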