You should learn about Clojure. Not only does it provide nice concurrency semantics (not actors, but more granular primitives), it's also functional (in a pragmatic way, like Erlang) and immutable to the core.
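To make "more granular concurrency" concrete, here's a minimal sketch using one of Clojure's built-in reference types, the atom. This is an illustrative example, not from the comment above: `swap!` applies a pure function to immutable state atomically, retrying on contention, so user code needs no locks.

```clojure
;; an atom holds a value; swap! updates it atomically with a pure function
(def counter (atom 0))

;; 10 threads each perform 100 lock-free increments
(let [threads (repeatedly 10
                          #(Thread. (fn [] (dotimes [_ 100] (swap! counter inc)))))]
  (run! #(.start %) threads)
  (run! #(.join %) threads))

@counter ; => 1000, no increments lost despite contention
```

Atoms cover uncoordinated, synchronous updates; Clojure also offers refs (STM, for coordinated updates) and agents (asynchronous), which is what makes the model finer-grained than an all-purpose actor mailbox.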
While we're on Clojure and the GPU, let me shamelessly promote my upcoming books (I'm the author of that old tutorial, and the state of GPU development in Clojure has progressed a lot since then):
Deep Learning for Programmers: An Interactive Tutorial with CUDA, OpenCL, DNNL, Java, and Clojure
Both books are progressing really well, and you can subscribe now to read the drafts and get the complete books when they're ready. There's no middleman, and 100% of the proceeds go toward funding the development of open-source Clojure HPC/GPU/ML/DL libraries.
tutorial: https://www.braveclojure.com/introduction/
on gpu: https://neanderthal.uncomplicate.org/articles/tutorial_openc...