How buggy is LabVIEW these days? Are there less buggy alternatives?

I used LabVIEW pretty intensively for interfacing with hardware, about 15 years ago. I gave it up because it was so buggy, and because National Instruments was so aggressive about pushing out poor-quality new versions that introduced new bugs. Now I'm working on a project it would have been a great tool for, if it weren't for the bugginess. Since my experience is old, I wonder: is this still true?

If they're still like they were, are there other roughly comparably capable environments that are less buggy? I see DASYLab advertised a lot, but it is also a National Instruments product, and they seem to position it as a less versatile, simpler sibling to LabVIEW.

I don't especially like the "graphical programming language" approach, but I can work with it. I also liked C# for various things, but it seemed like a lot of extra work to spend so much thought on the Windows environment.

Thank you for your thoughts!

LabVIEW is still very much the standard control language/environment for instrument control and data acquisition in testing. Despite the bugginess you note, I don't know of another system comparable in breadth of use. DATAQ is the only other one I see in common use (mostly on Dewetrons), but I believe its supported base of instruments is limited, whereas LabVIEW is a general (if proprietary) system that can interface with a large variety of third-party instruments and data acquisition hardware.

There are a few Python packages for data acquisition, including PyDAQmx, which is specifically for National Instruments hardware and lets you use Python as the control language. I have not used it, but if I were looking to avoid proprietary software I would start there, because I've come to use Python/NumPy/SciPy/Matplotlib for all of the data processing/filtering/munging/visualization that I used to handle in Matlab or Mathematica. I don't know how well it works for running a pseudo-real-time control system, though.
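To give a sense of what that looks like, here's a minimal PyDAQmx sketch of a finite voltage acquisition, following the pattern in the package's own documentation. I haven't run this against hardware; the device/channel name "Dev1/ai0", the voltage range, and the sample rate are placeholder assumptions you'd change for your setup.

```python
# Minimal PyDAQmx sketch: finite voltage acquisition from NI hardware.
# Assumes the NI-DAQmx driver is installed and a device named "Dev1" exists.
import numpy as np
from ctypes import byref
from PyDAQmx import (Task, int32, DAQmx_Val_Cfg_Default, DAQmx_Val_Volts,
                     DAQmx_Val_Rising, DAQmx_Val_FiniteSamps,
                     DAQmx_Val_GroupByChannel)

n_samples = 1000
data = np.zeros(n_samples, dtype=np.float64)
samples_read = int32()

task = Task()
# One analog-input voltage channel, +/-10 V range (placeholder channel name)
task.CreateAIVoltageChan("Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                         -10.0, 10.0, DAQmx_Val_Volts, None)
# 10 kHz sample clock, finite acquisition of n_samples points
task.CfgSampClkTiming("", 10000.0, DAQmx_Val_Rising,
                      DAQmx_Val_FiniteSamps, n_samples)
task.StartTask()
# Block (up to 10 s timeout) until all samples arrive
task.ReadAnalogF64(n_samples, 10.0, DAQmx_Val_GroupByChannel,
                   data, n_samples, byref(samples_read), None)
task.StopTask()
print(f"Acquired {samples_read.value} points")
```

PyDAQmx is a thin wrapper over the NI-DAQmx C API, so the method names mirror the C functions (minus the DAQmx prefix), and the data lands straight in a NumPy array ready for the SciPy/Matplotlib side of the pipeline.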

I don't know about C#, but after writing RTOS firmware in C/C++ I'm more than happy to have an abstraction layer that lets me graph out the logic of what I want the controller to do, even if the result is not totally optimized. I'm not a good enough C firmware programmer to write really optimized code anyway, nor do I have the time or patience to eke out that spare fraction of a millisecond to make something work just so. There are some things, like cleaning septic tanks, mudding and sanding drywall, and writing really efficient firmware, that it just pays to get someone else to do for you even if they charge a premium for it.

Stranger

Still using Excel or Python, because I haven't found anything else that combines enough flexibility, power, and ease of use. But LabVIEW only became viable for me in 2009, when it added parallel processes, so it can now actually do useful stuff that it couldn't do '15 years' ago. Since I personally don't need pictures of valves and partly-filled tanks, 15-year-old LabVIEW is of less use to me than 15-year-old MS Access – which had better scripting, a good interface tool, good persistent storage, and cost less anyway.

I automatically discount anything built with Java – inevitably slow and bloated, with an arcane interface whose great attraction is that it's an improvement on unix shell scripting – and most process control and test measurement software has a badly drawn interface controlling what's really just a ladder diagram.

Bah. Humbug.

Anyway. No. Haven't used LabVIEW recently. Come back and tell us what you decide on, and how it works out.

Yeah, this is the weirdest thing I learned when I picked up a bit of PLC programming on the Rockwell platform. I couldn't believe they were using ladder diagrams as the programmer's model.

I worked in the early and mid 1970s with control panels that did their logic in electromechanical relays. They were a pain in the a**. You couldn't have too many logical dependencies without overloading the contacts' current ratings, especially with inductive relay coils drawing arcs across the contacts as they broke. The logic would go all funny when contacts got dirty. And having to think in terms of those rungs, and where to physically place the relays on the subpanel to minimize wiring pain, was awful. The frustration of the limitations ladder diagrams put on thinking, expression, and readability was just more of that whole platform.

I still can't quite believe we use that now as a way of expressing ideas, a "programming language" if you can call it that. And when I look at the people around me using it, I can't imagine they ever did it with electromechanical relays. I'm just sort of baffled…
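For anyone who hasn't met a ladder diagram: each rung is just a boolean expression that the PLC re-evaluates on every scan cycle, a direct carryover of the old relay wiring. Here's an illustrative sketch (the names and button sequence are made up) of the classic start/stop seal-in rung, written out as the scan logic it amounts to:

```python
# Illustrative sketch: the classic start/stop "seal-in" ladder rung,
# expressed as the boolean logic a PLC evaluates once per scan cycle.
#
#   |----[ Start ]----+----[/ Stop ]----( Motor )----|
#   |----[ Motor ]----+
#
# Start and Stop are momentary buttons; Motor is the output coil whose
# own contact "seals in" the rung so it stays on after Start is released.

def scan(inputs: dict, motor: bool) -> bool:
    """One PLC scan: read inputs, evaluate the rung, return the output coil."""
    start = inputs["start"]   # normally-open start contact
    stop = inputs["stop"]     # stop contact (True = button pressed)
    return (start or motor) and not stop

# A few scans of a hypothetical button sequence:
motor = False
for pressed in [{"start": True,  "stop": False},   # press start -> motor on
                {"start": False, "stop": False},   # release start -> stays on
                {"start": False, "stop": True},    # press stop -> motor off
                {"start": False, "stop": False}]:  # release stop -> stays off
    motor = scan(pressed, motor)
    print(pressed, "->", motor)
```

Which is exactly the point: one line of boolean logic, drawn as if it were still two relay contacts and a coil on a subpanel.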