Archive for May, 2016
I occasionally write articles over at The University of Sheffield’s Research Software Engineering blog. This is a site I set up with Paul Richmond as part of our EPSRC Research Software Engineering Fellowships.
I recently helped a user of Maple get started with Sheffield’s HPC system and started writing up my notes as a series of blog posts. The first one is at http://rse.shef.ac.uk/blog/HPC-Maple-1/.
I’ve just delivered a session called ‘R awareness’ to a group of IT staff at the University of Manchester. The audience was a mixture of desktop support, applications support and research software engineers, and initial feedback suggests that it was well received.
The focus of the session was not the R language itself but the software infrastructure that surrounds it: multiple versions of R, packages, RStudio, the Jupyter notebook, Microsoft R Open, SageMathCloud, and the ways in which applications such as Mathematica, Maple and Visual Studio interact with R.
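The Mathematica–R connection is a nice example of the sort of thing we looked at. Here’s a minimal sketch using Mathematica’s RLink (this assumes the R runtime that ships with Mathematica; depending on your installation, InstallR may need to be pointed at an external copy of R via its "RHomeLocation" option):

Needs["RLink`"]
InstallR[] (*start the R runtime*)
REvaluate["R.version.string"] (*ask R which version it is running*)
REvaluate["rnorm(5)"] (*five standard normal draws, returned as a Mathematica list*)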
I chose to deliver the material in the same way that The Code Cafe is delivered – self-directed material where I act as facilitator. This seemed to work really well and there was a lot of conversation and interaction with the audience of a kind that I find is missing from more traditional presentations.
Course material is at https://github.com/mikecroucher/R_awareness
I learned about entropy as part of my undergraduate Physics education but the concept turns up in many fields, including linguistics, thermodynamics, information theory, chemistry and artificial intelligence.
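In information theory, for example, the Shannon entropy of a message measures its average information content per symbol. Mathematica has a built-in Entropy function, so here’s a quick back-of-the-envelope illustration (the strings are just arbitrary examples of mine):

(*Base-2 Shannon entropy of the characters in a string*)
N[Entropy[2, Characters["mississippi"]]] (*roughly 1.82 bits per character*)
N[Entropy[2, Characters["aaaaaaaaaaa"]]] (*0 – a constant string carries no information*)

The more evenly spread the symbols, the higher the entropy – which is also the idea behind the logo further down this post.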
As part of Sheffield’s Open Data Science Initiative, computer scientist Neil Lawrence has teamed up with linguist Dagmar Divjak to organise a cross-faculty discussion meeting on the subject of entropy.
For more details on the day’s events, and to register, see http://opendsi.cc/ed2016/program
I wasted a little time producing the above logo for the event using Mathematica.
Here’s the source code:
(*Consider a column one pixel at a time. Invert the pixel if a random number is below some threshold*)
flipbit[col_, prob_] := Module[{result, x},
  result = col;
  Do[
   If[RandomReal[] <= prob,
    If[result[[x]] == 1, result[[x]] = 0, result[[x]] = 1];
    ],
   {x, 1, Length[col]}];
  Return[result]
  ]

text = "Entropy";
image = Rasterize[Text[Style[text, White, Italic, 190]], Background -> Black];
imageData = ImageData[Binarize[image]];
const = 1/Dimensions[imageData][[2]]*0.42;

(*Apply flipbit to all columns. Increase the probability of flipping as you move along the x-axis*)
logo = Transpose[MapIndexed[flipbit[#1, const*#2[[1]]] &, Transpose[imageData]]];
Image[logo]
Finally, I found this quote about entropy that I quite like:
You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.
John von Neumann, suggesting to Claude Shannon a name for his new uncertainty function. Source: Wikiquote