
Wow, thanks for posting! I’m curious about your thoughts on other visual programming languages, specifically LabVIEW.

My first job out of college was at NI, where they heavily use LabVIEW. Every engineer in my cohort thought that paradigm was going to take off, and over a decade later, it clearly hasn’t. One large factor is cost and proprietary tech, but I’m surprised few outside the hardware-testing world even know such a thing exists.

I now code in Rust and Ruby and don’t yearn for a visual language, but I do wish people who needed to code something up quick could do it without getting knee deep in IDEs, syntax, and terminals.

I’m curious if you’ve got an opinion on that space. What’s holding it back, and will we ever see a “killer language” for visual programming in the productivity-focused space? What do you think would be needed for Scratch to fill that role?



Thank you for the feedback!

I made use of LabVIEW around 2002-2005 and I liked it. I worked on systems to measure the length of fiber optics down to the femtosecond (it's amazing what you can do with differential waveform/phase analysis), which then informed custom C++ code to do these measurements in real time at a millimeter-wave interferometer, enabling real-time adjustment of the sample phases for any fiber that was being heated and expanding in the sun.

LabVIEW allowed us to put together a very quick demonstration of this as a working concept, which illustrates what I believe is the most powerful thing about visual programming environments: they excel at demonstrations and full working solutions without the parts of computation that are social constructions (language syntax, data structure access and limitations, ...).

I am not convinced that the idea of a "killer language" will happen. While there is a through-line of abstraction heaped upon abstraction, I am unconvinced that these first 60 years of computation in society are going to be visibly recognizable another 60 years from now.

OR!

Linus Torvalds will invent a language that takes over everything, based on how git has consumed almost the entire space of source code management systems (a sociological phenomenon/opinionated workflow process) and the success of GNU/Linux more generally.


Version control and diffing (programmatically or mentally) is a major stumbling block. The only graphical language I know of that doesn't suffer from this issue is PLC ladder logic, where the visual representation is fixed: it's easy to programmatically show the differences between two versions of a given program, or to know mentally exactly what given logic will look like. Pretty much everything else is miserable to compare code in, and that's especially painful when you're first learning and the examples you're referencing are unhinged LabVIEW spaghetti with no real way to make them any more pleasant to read.


Just some context: I am quite experienced in LabVIEW.

One of the major issues with visual languages is not being able to find the right chunk. Yes, there is a search feature, but if you don't know the right word to search for, you are doomed. In my time as an undergrad research assistant I was pumping out complex LabVIEW code and VIs to control whole experiments, but I still ran into problems where I couldn't for the life of me find the block I needed. The most visceral of these was converting something like a double to a single, or vice versa. I went hunting for the correct block to do the conversion, which would take barely a line of code in C++ (the language I knew at the time), and it wasn't in the 'convert' subset of blocks. I had this vast program that worked, but I couldn't complete the last part of the data path because of this stupid missing conversion block. It took me about an hour to find it, after trying everything including Google and NI's documentation.

So stuff like that would be the main reason I think they never took off. If you have to hunt and peck to find the right box, instead of innately knowing the syntax for something like a conversion, it just isn't practical; it's easier to look something up in a reference text for the language than to hunt through NI's walled-garden crazy house.

All that said, LabVIEW had some of the best connectivity to lab hardware. Over a decade ago you could hook two to five devices made between 1985 and 2000 into your code over GPIB, assuming you had VIs for them (sometimes you had to make them yourself). It's hard to overstate how amazing that was: being able to automate the control and data acquisition of hardware made during the infancy of the internet, for research. It also made you quite valuable in your PI's eyes. So unlike most people I know from that era of my STEM career, I was one of the ones who said "yeah, I hate LabVIEW... but man is it good at what it does." From conversations with grad students these days, it sounds like the Python libraries have caught up on that front.


One of the major issues with visual languages is not being able to find the right chunk. Yes there is a search feature, but if you don't know the right word to search you are doomed

how is that different from searching for the right function or library in java or python or any other language?

either you know what to search for or there is good documentation or you have to ask somewhere. i don't see how visual languages fare any worse here. except maybe that the visual chunks don't help as much in memorizing their name so you end up searching more often to find it again.

instead of innately knowing the syntax to do something

any syntax or functionality needs to be learned somehow.


It sounds like the discoverability of the LabVIEW “syntax” leaves much to be desired. Perhaps because it’s a proprietary tool used by lots of non-programmers, the Google results seem really lacking: it’s much harder to find a top Stack Overflow answer, and the official documentation seems poorly structured.


I get the feeling that graphical programming's real problem is the tooling.

The fundamental issue is that diff on textual code has no side-effect-free equivalent in a visual paradigm. One example: color often carries meaning in visual systems, and is thus hard to repurpose for showing differences.

I'd love to be proven wrong around this though.


I actually spent 3 months and a few $k developing an automotive relay driver, pedal control and dual CAN shield for NI myRIO for firmware development of a small hobby car, electric go-kart, e-bike etc.

The hardware worked, but the usability of the standard blocks was, for some reason, a disaster compared to other visual design tools like MATLAB Simulink.


What’s NI?



Ahh, I thought it was Native Instruments. Music production is a lot like graphical programming in some sense: you perform a bunch of mathematical operations on sound data by turning knobs, dragging waveforms, and drag-and-dropping snippets.


Native Instruments Reaktor looks inspired by LabVIEW.



