![Use an AMD GPU for your Mac to accelerate Deeplearning in Keras | by Daniel Deutsch | Towards Data Science](https://miro.medium.com/max/1400/0*rolAhtBQQDGAnjGa.png)
Use an AMD GPU for your Mac to accelerate Deeplearning in Keras | by Daniel Deutsch | Towards Data Science
My GAN deep neural network learning to write handwritten digits, running on an AMD GPU with W7.
![Machine learning on macOs using Keras -> Tensorflow (1.15.0) -> nGraph -> PlaidML -> AMD GPU - DEV Community](https://res.cloudinary.com/practicaldev/image/fetch/s--1uexk-Y9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/NervanaSystems/ngraph/blob/master/doc/sphinx/source/graphics/nGraph_main.png%3Fraw%3Dtrue)
Machine learning on macOs using Keras -> Tensorflow (1.15.0) -> nGraph -> PlaidML -> AMD GPU - DEV Community
Is machine learning in Python best done with Nvidia based GPUs or can AMD GPUs also be used just as well in terms of features, compatibility and performance? - Quora
![Evaluating PlaidML and GPU Support for Deep Learning on a Windows 10 Notebook | by franky | DataDrivenInvestor](https://miro.medium.com/max/909/1*_aYP5ONS-bbbtHzoX9svaQ.png)
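The articles linked above all route Keras onto an AMD GPU through the PlaidML backend. A minimal sketch of the backend selection, assuming the `plaidml-keras` package is installed and a GPU has already been chosen with the `plaidml-setup` CLI (both assumptions, not shown in the links above):

```python
# Select the PlaidML backend *before* any `import keras` runs;
# Keras reads KERAS_BACKEND once, at import time.
import os

os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"

# From here the usual Keras API is unchanged, e.g.:
# import keras
# model = keras.models.Sequential()
```

With the environment variable set, the rest of a Keras script (model definition, `fit`, `evaluate`) needs no changes to run on the AMD GPU.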