Can I use MATLAB with an NVIDIA GPU on macOS 10.14 Mojave and newer?
221 views (last 30 days)
MathWorks Support Team, 30 January 2019
MathWorks Support Team, 2 September 2022
Edited: MathWorks Support Team, 24 March 2021
MATLAB requires that an NVIDIA-supplied graphics driver be installed on your Mac in order to take full advantage of an NVIDIA GPU. NVIDIA has not released an Apple-approved graphics driver for macOS Mojave. For more information, please see the official statement from NVIDIA on NVIDIA's developer forums.
The impact on MATLAB is as follows:
You can use MATLAB with an NVIDIA GPU on macOS Mojave and newer; however, graphics performance is degraded compared to running MATLAB on previous releases of macOS.
NVIDIA-specific functionality such as CUDA is not available, which means GPU arrays, provided by Parallel Computing Toolbox and used by many products, will not work.
The following products have features that make use of CUDA functionality, and these features are impacted by the lack of an NVIDIA-supplied graphics driver:
- Parallel Computing Toolbox
- GPU Coder
- Image Processing Toolbox
- Deep Learning Toolbox
- Statistics and Machine Learning Toolbox
- Computer Vision System Toolbox
- Signal Processing Toolbox
- Communications Toolbox
- Phased Array System Toolbox
- Text Analytics Toolbox
- Reinforcement Learning Toolbox
Aleksander Tyczynski, 14 August 2019
Would downgrading to macOS 10.13 mean that I could make full use of the NVIDIA GPU on my mac? Use it for the Deep Learning Toolbox?
Colin Fraser, 14 August 2019
Assuming that NVIDIA was able to release Apple-approved drivers for 10.13, you should be able to use it on your Mac. Just make sure the MATLAB release is supported by the OS.
Jason Ross, 14 August 2019
Yes, the GPU will still work with the CUDA driver on macOS 10.13 for now. There are some things to keep in mind with respect to this support in future versions of MATLAB, though:
- The GPU generations supported on macOS are Pascal and Kepler. There is no support for Volta or Turing cards as of this writing. NVIDIA will drop support for Kepler and Pascal cards at some point in the future; there is no hard date for this yet.
- MATLAB has a dependency on the CUDA SDK and Toolkit, which in turn have dependencies on system compilers. The toolkit versions and compilers will continue to advance, and at some point there may be no NVIDIA driver that supports that toolkit release, or that works with a current compiler.
- Apple will at some point stop providing security updates for macOS 10.13. We publish a road map of which MATLAB release supports which macOS release. For macOS 10.13, the last supported MATLAB release is R2020a.
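On any platform, a quick way to confirm whether MATLAB can actually see a usable CUDA device is a sketch along these lines (requires Parallel Computing Toolbox; it uses only the documented gpuDeviceCount/gpuDevice API):

```matlab
% Minimal check of whether this MATLAB session can see a usable CUDA GPU.
if gpuDeviceCount > 0
    d = gpuDevice;              % select and query the default device
    fprintf('GPU: %s, compute capability %s\n', d.Name, d.ComputeCapability);
    x = gpuArray.rand(4);       % array allocated on the GPU
    y = gather(x * x);          % bring the result back to host memory
else
    disp('No supported GPU driver/device is visible to MATLAB.');
end
```

On a Mojave-or-newer Mac with no NVIDIA driver, the `else` branch is the one that runs.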
madhan ravi, 14 August 2019
Thank you, Jason Ross!!
Walter Roberson, 24 August 2019
The discussion from "metacollin" at https://forums.developer.nvidia.com/t/when-will-the-nvidia-web-drivers-be-released-for-macos-mojave-10-14/65895 is interesting. The claim made there is that Mojave no longer uses OpenGL itself and that Apple will not approve any drivers that do not have Metal support, which NVIDIA does not have as yet.
As Metal is a proprietary API, it is not obvious that it would be financially worthwhile for NVIDIA to write such drivers.
Edited: Matthew Fitzgerald, 14 September 2019
Sorry to reopen this can of worms, but is Metal support on the development roadmap for MATLAB?
Or would it be possible to develop a "translator" or something similar to be able to use the GPU via Metal, or is it just too much work for an individual or small team?
Apple developer documentation describes support for parallelisation, matrix operations etc. and it looks like they should be able to be driven in a metal context.
The "translator" is the existing OpenGL framework from Apple, which they have said they will stop supporting soon and which they will certainly not improve. I would expect that within two, at most three, OS releases from now, macOS will pretty much not function with OpenGL.
Apple considers such a translator to be too much work for them, so I would not expect Mathworks to be able to handle it.
I guess either Apple and Nvidia will work out their differences, or Apple and AMD will develop Metal to be a real alternative to CUDA.
In the meantime, does anyone know if it's possible to use something like PlaidML for these hardware/software (Apple/Metal, PlaidML/Matlab) combinations?
Metal is a graphics protocol, not a gpu interface.
In theory Apple could certify a set of gpu drivers for cuda that were distinct from the graphics drivers. I do not know what either Nvidia or Apple are thinking of at this point.
Based on past history, I can speculate, without any inside knowledge at all (so I could be wrong, wildly so):
Apple seems willing to have NVIDIA say "fine, we won't bother porting to Apple then!". It has been 6 years since Apple put an NVIDIA GPU into anything other than the Mac Pro, so except perhaps on the Mac Pro side, their direct revenue doesn't depend much on NVIDIA.
Apple probably has more to lose from the game industry's dependence on OpenGL, and several major high-profile games companies are working on Metal ports and (I gather) getting better performance than DirectX, so Apple can expect to keep some of the games market, the high-performance end.
One reason Apple can afford to tell/let NVIDIA take a hike is that Apple has AMD to rely on. The old play-one-company-off-against-the-other trick.
But really, Apple hates being dependent on one company, because that gives the company too much leverage. Apple's solution is to go in-house, to build its own graphics and GPUs. Indeed, it has already been working on that for years (https://www.ft.com/content/8c778fae-18df-11e7-a53d-df09f373be87), and I find 2015 articles about this. They have already put their own GPU into some of their phones.
Apple has also been working on replacing x64 with an in-house CPU (https://www.engadget.com/2018-04-03-apple-macbook-laptop-chips.html), with a possible Mac out next year. If I recall correctly, definitions for the new CPU have already been found inside the OS currently in beta, Catalina (which, by the way, ends 32-bit support).
If I were Mathworks, I would probably think hard about holding off on putting effort into Metal until more was known about the new CPUs, because if the new CPUs are not machine-code compatible with x64, it is not obvious that Mathworks will want to bother: it would be a big effort for a platform estimated by some parties to be roughly 15% of their market.
Oh yes, if Apple goes in-house for GPUs (already known to be well underway), there is no certainty that they will be AMD- or NVIDIA-compatible; more likely they will not be, OpenCL at most. This is a reason why it would be risky for Mathworks to spend much effort on AMD GPUs for Apple systems.
I can talk about these things because all I know is what is known to the public: I have not discussed this with Mathworks or Nvidia or Apple or AMD.
Michael Melnychuk, 29 November 2019
On NVIDIA's site there is a CUDA driver released in May 2019 (after the above comments). I'm wondering if you know whether this will work with MATLAB? Maybe this will help someone.
Walter Roberson, 29 November 2019
Unfortunately the supported OS for CUDA 418.163 is 10.13 (High Sierra). I do not know for sure whether Mojave will reject it, but I am certain that Catalina will.
Christian Kennedy, 9 December 2019
The take-away on this is that the intersection of GPU support in MATLAB and GPU support post macOS 10.13 is the null set. NVIDIA provided no web drivers for 10.14, and there's no rational reason to expect them to do so for 10.15; meanwhile there's no support for AMD GPUs within MATLAB, and it seems equally unlikely that that will evolve.
Compiler sensitivity has already bitten me in the ass on multiple occasions under Linux, so it's not clear that there's a solution there. In short, if you want to have meaningful GPU support under Matlab, it's looking like it's going to take a freaking Windows box to provide it.
It's at dark moments like this that Julia running in a Jupyter notebook seems an almost rational solution to life's problems.
Walter Roberson, 9 December 2019
LeChat, 15 February 2020
I agree with @Christian Kennedy. I feel more and more the push toward getting into Julia (OpenCL support being one very decisive feature for me, especially with the divorce between NVIDIA and Apple). I believe MATLAB should really evolve toward OpenCL if it wants to survive (in the GPU computing world and on Mac).
Walter Roberson, 16 February 2020
Apple does not support OpenCL either, and will not in future.
Apparently OpenCL leaves enough parts optional, which do in fact differ between manufacturers and models, that Mathworks would not be able to provide a single OpenCL implementation... at least not an efficient one.
LeChat, 17 February 2020
Does this mean that GPU computing on Matlab will die on Mac? Is there any compatibility with Metal planned?
At least, please try to keep future MATLAB versions compatible with macOS High Sierra (10.13.6), so that we still have CUDA drivers for our NVIDIA GPUs...
Walter Roberson, 17 February 2020
Does this mean that GPU computing on Matlab will die on Mac?
I have been emphasizing to Mathworks that this is something they need to talk about. They have acknowledged reading my concerns, but I do not have a response from them on the topic.
Alejandro Robinson Cortes, 25 August 2020
Is there any solution for using GPU computing in MATLAB with a Mac?
This issue has come up in multiple forum questions for months. Is there any solution to this? I am buying a new computer, and am seriously considering dropping MATLAB and/or Mac just for this reason. At this rate, I am inclined to drop both. Please come up with a solution for this.
Jason Ross, 25 August 2020
The current solution is that if you want to do CUDA computing of any sort (which includes MATLAB), you need to do it on Linux or Windows. NVIDIA continues to provide both new hardware and software for these platforms, and MATLAB continues to work with them. We are dependent upon Apple and NVIDIA to provide that support, and it has been non-existent since 10.13, as I commented nearly a year ago. There has been no new information from Apple, and no new information from NVIDIA, despite both vendors developing and delivering major new hardware and software offerings to the market.
I'm sorry that I don't have a better answer at this time, but that is the current state of affairs.
Ted Wong, 22 September 2020
Is there a way to switch from GPU to CPU? I'm OK if the code takes longer to run.
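For the common Deep Learning Toolbox case, yes: training can be forced onto the CPU via the documented `ExecutionEnvironment` option of `trainingOptions`. A minimal sketch, where `XTrain`, `YTrain`, and `layers` are placeholders for your own data and network:

```matlab
% Force trainNetwork onto the CPU regardless of GPU availability.
% XTrain, YTrain, and layers are placeholders for your own data and network.
opts = trainingOptions('sgdm', ...
    'ExecutionEnvironment','cpu', ...   % 'auto' would pick a GPU if one exists
    'MaxEpochs',5);
net = trainNetwork(XTrain, YTrain, layers, opts);
% For gpuArray-based code, simply keep data in ordinary host arrays:
% plain arrays run on the CPU, and gather(g) converts an existing
% gpuArray g back into a host array.
```

The same `'cpu'` value works for `predict`/`classify` as well, so a script can be kept GPU-free end to end.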
Well, in the year since, there was news from NVIDIA in November that they will not be making any further macOS drivers. http://www.cgchannel.com/2019/11/nvidia-drops-macos-support-for-cuda/
Apple moved third-party drivers down one ring in security (to reduce the ability of drivers to affect the security of other processes), and apparently now requires that third-party drivers be included with each different application, instead of one driver being installed for use by all applications. That would have required each different application to include the NVIDIA drivers (and driver updates would have had to go through the Apple App Store for any product purchased there).
That would have been quite a burden for developers, unless NVIDIA and Apple had been able to come to an agreement for Apple to bundle NVIDIA drivers... which Apple would not have much inclination to do unless NVIDIA paid them a bunch of money. Reminder: along with Apple's new ARM based CPUs, Apple also has its own custom GPUs, so Apple now sees NVIDIA as a competitor...
Does this mean that GPU computing on Matlab will die on Mac?
NVIDIA GPU computing for MATLAB is already gone on Mac; it is not present in R2020b.
I do not have any information about whether Mathworks is working on support for AMD cards -- but considering Apple is moving to their own GPUs, it would not make much sense for Mathworks to pursue AMD GPU support just to keep the Mac market. My reading has also suggested that IBM hardware hosts the second-biggest deep-learning research effort, so from a research perspective, IBM support might have higher priority than AMD support.
Edited: M J, 12 November 2020
Sorry for the basic question, but is there any way I can train a neural net using a GPU on my Mac (10.15) at all? It seems like there are no options left, and I'm kind of confused. Best.
No, there are no GPU options for Mac starting with Mojave. Nvidia gave up and has left the Apple market. (Apple was not cooperating with Nvidia, and was declining to approve all new driver versions Nvidia produced; there was a bunch of chatter alleging that top Apple people had ordered the company to not approve the drivers.)
As of today, the new ARM-based Mac was released. Mathworks has indicated that they are working on a patch for support in R2020b using Rosetta, with native support in the release after. However, GPU support is not expected.
Thank you for your answer, appreciate it!! Wow, that's disappointing...
Unfortunately, the cost of adding support for a different kind of GPU is not small. It is not enough to add OpenCL, as the current support makes extensive use of vendor-supplied high-performance libraries of linear algebra and similar routines (such as fft).
The reading I did earlier this year suggested that AMD is considerably behind in market share for Deep Learning, and that the major competitor to NVIDIA is IBM -- so if the goal were to support high-performance Deep Learning techniques specifically, rather than just general GPU computing, then IBM might be a wiser move. On the other hand, the IBM boards do not seem to be common in the mass market, so for general-purpose work AMD would seem a better choice... but then there is the factor that, with Apple's new M1 architecture, Apple systems will be neither IBM nor AMD...
Edited: M J, 6 December 2020
Very interesting! Thank you for the info. So in any case, I think I will get a desktop/workstation to overcome this problem. Do you think it would be safe to go with something that has, say, a GeForce RTX 2060 (compute capability 7.5), assuming I intend to work with the trainNetwork function in the long term? Would it be a wise move for the long term (next 5 years)? Thanks again.
Walter Roberson, 6 December 2020
My personal assumption would be that 5 years from now, you would be handing an RTX 2060 down to a young relative or neighbour who has an interest in "retro" gaming (early 2020's), with you either having gotten out of the deep learning field yourself, or else having upgraded to something newer / faster / buzz-word-ier.
5 years ago was Maxwell architecture, GeForce 9xx timeframe. Not a terrible architecture by any means, but if you had one in hand now, you would be dithering over upgrading it now or waiting for the next NVIDIA release hoping for a price drop on the RTX 2xxx series.
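One way to check whether a given card is still usable by a particular MATLAB release is to query `gpuDevice`; a minimal sketch using its documented `ComputeCapability` and `DeviceSupported` properties:

```matlab
% Query the installed GPU and check it against this release's requirements.
d = gpuDevice;   % errors if no CUDA device/driver is present at all
fprintf('%s: compute capability %s, supported by this release: %d\n', ...
    d.Name, d.ComputeCapability, d.DeviceSupported);
if ~d.DeviceSupported
    warning('This GPU is below the minimum compute capability for this MATLAB release.');
end
```

The minimum compute capability MATLAB accepts rises over releases, which is exactly the "5 years from now" concern above.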
Edited: M J, 6 December 2020
As long as it will still be compatible with the Deep Learning Toolbox at least, I'm okay with that. But good point though :-) Thanks a lot!
OK, as long as the Statistics and Machine Learning Toolbox is covered.
Walter Roberson, 31 May 2021
Deep Learning on GPU, and GPU use in Statistics and Machine Learning, will not be supported on Mac any time soon.
I have no information about whether Mathworks is working on GPU for M1; every time I ask through private channels, I get silence.
More Answers (3)
SALIOU Fall, 8 February 2021
The MATLAB app cannot be installed on my MacBook Air. What can I do?
Mine is not an Air.
victor chen, 1 August 2021
Where can I get the newest-generation GPU processor in a computer, like an Apple Pro?
Or whatever you would recommend!
Many thanks indeed.
Unfortunately, MATLAB for macOS has already dropped NVIDIA support. macOS Catalina is the last macOS that had the drivers, and Pascal was the newest supported architecture.
But to answer the question:
For the Mac Pro, the limit is the Gen 2 Mac Pro (2009) with the NVIDIA GT 120, CUDA 1.1. You would have to use it with a quite old version of MATLAB, somewhere around R2013-ish. This was the only NVIDIA card that Apple itself supported for any Mac Pro, as far as I can tell.
Jason Ross, 2 August 2021
Edited: Jason Ross, 2 August 2021
As Walter says, there are no modern Mac systems that support CUDA or GPU processing using CUDA. To use the latest GPUs (at this point, Ampere cards like GeForce 30XX or A100) you need to run Windows or Linux.
Walter Roberson, 2 August 2021
You may be wondering about getting an eGPU for Mac Pro. As far as I can determine at the moment, the eGPU recommended by Apple never supported NVIDIA. I find two Thunderbolt 3 eGPU manufacturers that do support Nvidia, but you cannot get MacOS CUDA drivers for anything newer than Pascal architecture on Catalina, and Mathworks has dropped support for GPUs on MacOS (because Nvidia has dropped support.)
Felix-A. Lebel, 20 March 2022
Considering how performant Apple's system-on-a-chip architecture (M1) has become, is Mathworks considering the possibility of using this hardware's new capabilities to the benefit of its users? It is a shame that we cannot take full advantage of the hardware due to a dispute between NVIDIA and Apple. Politics should not get in the way of science.
Walter Roberson, 20 March 2022
Mathworks considered it, and decided not to proceed for several years (if ever.)
Mathworks is not in the business of writing high-performance numeric libraries such as eigenvalue, QR, and FFT routines. It relies on third-party libraries, and Apple has not created suitable ones. Apple does not have the kind of tool chains that NVIDIA has for creating high-performance mathematics. The needs of graphics systems for display are not the same as the needs of science and engineering.
Yu Cheng Huang, 30 May 2022
Who cares? MATLAB?
It's time to turn to Python.
Walter Roberson, 30 May 2022
I read a posting from a company that was trying to do some higher performance computing on the Apple M1 GPUs. They wrote that the documentation from Apple about how to achieve performance was very weak, and that they tried a number of approaches but were not able to get nearly the rated performance. Apple did not cooperate with them.
This is very different from NVIDIA, which puts a lot of effort into making high-performance computing accessible to developers.
If Apple does not provide the ecosystem and does not provide enough information for developers to create ecosystems themselves, then the task becomes rather difficult.