Showing posts from November, 2021

Apologies to commenters!

I've just discovered that comments on the blog have been queuing up for moderation without my realising it. I was expecting notification when new comments were posted but that hasn't been happening. I'm now working my way through the backlog. If you've been waiting for a response, I can only apologise.


Regular readers will remember that I've been exploring JAX. It's an amazing tool for creating high-performance applications that are written in Python but can run on GPUs and TPUs. The documentation mentions the importance of thinking in JAX. You need to change your mindset to get the most out of the language, and it's not always easy to do that.

Learning APL could help

APL is still my most productive environment for exploring complex algorithms. I've been using it for over 50 years. In APL, tensors are first-class objects, and the language handles big data very well indeed. To write good APL you need to learn to think in terms of composing functions that transform whole arrays. That's exactly what you need to do in JAX.

I've been using JAX to implement models of spiking neural networks, and I've achieved dramatic speed gains using my local GPU. The techniques I used are based on APL patterns I learned decades ago.

Try APL

APL is a wonderful programming
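To give a flavour of the array-at-a-time style described above, here is a minimal sketch of a spiking-neuron update written the JAX way. It is not my actual model code; the function name `lif_step` and the leaky integrate-and-fire dynamics are illustrative assumptions. The point is that the state of every neuron is updated in one whole-array expression, with no Python loop over neurons, exactly as one would write it in APL.

```python
import jax
import jax.numpy as jnp

def lif_step(v, input_current, decay=0.9, threshold=1.0):
    """One illustrative leaky integrate-and-fire step over all neurons at once.

    v and input_current are vectors indexed by neuron; every operation
    below transforms the whole array in one go.
    """
    v_new = decay * v + input_current        # leaky update for every neuron
    spikes = v_new >= threshold              # boolean spike vector
    v_reset = jnp.where(spikes, 0.0, v_new)  # reset the neurons that fired
    return v_reset, spikes

# jit-compile the whole-array update so XLA can fuse it for GPU/TPU
lif_step_jit = jax.jit(lif_step)

v = jnp.zeros(4)
currents = jnp.array([0.5, 1.2, 0.3, 1.0])
v_next, spikes = lif_step_jit(v, currents)
```

Because `lif_step` is a pure function of arrays, `jax.jit` can compile it once and run the fused kernel on whatever accelerator is available.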