For those of you who weren't able to catch the live stream, be sure to watch our interview with ARM Fellow and all-around GPU expert Jem Davies. In our Hangout, Jem talked about how he got involved with GPUs at ARM (spoiler alert: his boss told him to go out and buy a GPU company). He also shared his thoughts on Mali's unique architecture and how it may evolve over time. We discussed the future of mobile gaming, solving memory bandwidth challenges, and the potential for ARM scaling its GPU architectures from wearables all the way up to supercomputers.

I reference it a few times in the interview, but you should also check out Ryan Smith's deep dive into ARM's Midgard GPU architecture. ARM is the second mobile-focused GPU vendor to allow us unfettered access to the details of their GPUs, and we're very thankful for it.

I'd like to thank Jem once again for taking the time to chat with me, and if you haven't already seen them, I'd suggest watching some of our other Hangouts with ARM.

Comments

  • mmrezaie - Wednesday, July 9, 2014 - link

I wish they saw the Intel driver model as financially feasible. I don't like the closed model that most ARM vendors are taking, and ARM itself is no better. I mean in the Linux ecosystem, of course.
  • TETRONG - Wednesday, July 9, 2014 - link

I wonder whether ARM has any interest in pursuing the CNN (cellular neural network) approach to processing? Even in an exotic research sense, maybe as a co-processor which could be optionally incorporated into some of their designs. It seems well suited to tasks coming up in the future: inference, language, computer vision, navigation, etc.

    I remember reading books about this, and it sounded so promising in terms of unique capabilities compared with, let's say, the von Neumann/Harvard paradigm.
    It seems to have been abandoned to a large degree despite the performance possibilities.
    Has it just gone dark for military reasons?

    Would really love to see this technology resurrected with modern fabrication techniques.
  • TETRONG - Wednesday, July 9, 2014 - link

    Sorry, this is what I'm referring to
  • BMNify - Thursday, July 10, 2014 - link

    They already are, in a roundabout way. Steve Furber, now ICL Professor of Computer Engineering at the University of Manchester and leader of the SpiNNaker computing architecture project there, and a principal designer of the legendary BBC Micro, has been using ARM cores to simulate the brain for many years now.
  • TETRONG - Thursday, July 10, 2014 - link

    Wowwowwow, thank you..
    It's sort of difficult to track information on this subject, because most of the companies doing this research in the early noughties seem to have folded, and when you mention "neural network" most people assume you're talking about the traditional type rather than an actual new computing architecture based on the insights of Leon Chua.
    Looks like a really good contemporary resource for me to investigate further-thank you again!
