LabVIEW Programming Workshop, Intro to Graphical Coding - Interest Check

LabVIEW programming for makers, interest check… I have gotten requests, and I think there are people around DMS who simply need the basics of how to navigate the interface, alongside a discussion of the sorts of cool things we can do at DMS with the copy of LabVIEW the space owns, running on the big PC in the Electronics room (last I looked, prior to the recent equipment bench changes).

Stated simply: skill with LabVIEW will let you hook a computer up to anything that can be measured or controlled electronically, and within a few minutes crank out a little custom GUI to control it and see data from your new ‘virtual instrument’.

If we did something like this, I would think the format should be a two-part class of 1-2 hours each. Attendees arrive with a laptop with the eval version installed, and I program a sample piece of code on the projector. Then programming time: I’ll suggest a few examples to pick from, and I can hang around and get everyone writing, executing, and debugging their first LabVIEW program.

LabVIEW is quite analogous to C… it is a full-blown programming language. It has by far the best selection of already-working drivers for nearly any device capable of computer control or communication, written by LV users across the globe. As for the code itself, it looks like a circuit schematic, with no textual syntax and no separate compile step needed to run it.

I was trained on LabVIEW as an employee of its maker (NI) in 1997, and I remember feeling quite skeptical about it initially. But after writing my first program, once I realized how quickly and easily I could make something happen in this language, I have not elected to use a textual language since; I’ve only used the text stuff when I had to. In my estimation, if you were to compare development speed of LabVIEW vs. C or Java or whatever, I would hazard a guess that a good LabVIEW programmer sees a 3:1, maybe even a 10:1, time improvement over a good traditional-language programmer. It’s really that much faster. So it’s for sure the shiznit for custom internal stuff, but when distributing to a mass of customers, better start negotiating schedule with your software team for some C or Java dev right now!

So how interesting is such an entry-level programming workshop to the membership at DMS?

What sorts of programming example suggestions do you guys have?

3 Likes

I and several others are interested. Heck, I’d show up just to take notes and learn what its capabilities are.

Interests would include controlling external equipment and measuring the response of amps and filters, including phase shift.

1 Like

Well, how about this: we can perhaps put together a list of people interested, and hopefully we can exceed the minimum honorarium attendance limit for the benefit of Electronics (and myself, I know, sounds selfish till you see my bills). After thinking about things a bit more, I think a few short sessions would be best. I can plan single-hour classes, but I’ll remain available for 2 hours, as folks will likely want to get their programs functional and that just takes time and patience.

So the first class would simply be a look at applications pertinent to DMS and makers while each student installs a copy of the eval or student version of LabVIEW and the drivers on their laptop. You see, a class of me droning on about LabVIEW would likely have minimal impact, other than maximal sedative effect.

But a class where each student writes programs is a whole different matter. And a class where each student is able to toggle an external LED and perhaps read back the state of a digital line teaches not only some LabVIEW coding, but also illustrates its usefulness for interfacing to the outside world. As long as each student has access to a laptop along with ADMIN rights to the machine, we can show how to take over your serial port and/or parallel port to move data in and out, or simply work with a digital line state. (Newer PCs will probably need a USB serial/parallel adapter… see the eBay ad at the bottom.) But this is a way to get a very cheap digital I/O card, so to speak. I have written more than one I2C interface using these free ports on computers.
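LabVIEW block diagrams are graphical, so I can’t paste one here, but for anyone who wants a text-language feel for the “serial port as cheap digital I/O” trick first, here is a rough Python sketch using the pyserial library (the port name is a placeholder for whatever your adapter enumerates as):

```python
# Rough sketch of using a serial port's control lines as cheap digital I/O.
# Assumes the pyserial package (pip install pyserial); "COM3" is a placeholder
# port name, so substitute whatever your USB serial adapter shows up as.
import time
import serial

port = serial.Serial("COM3")      # baud rate is irrelevant when only toggling control lines

for _ in range(10):
    port.dtr = True               # drive the DTR output high (LED on, through a resistor)
    port.rts = False              # RTS is a second free output line
    time.sleep(0.5)
    port.dtr = False              # LED off
    port.rts = True
    time.sleep(0.5)
    print("CTS input reads:", port.cts)   # CTS/DSR can be read back as digital inputs

port.close()
```

In class we would build the same loop graphically; this is only to show how little is involved.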

So if we can rally the troops, here is what we would need: a list of enthusiasts who can commit their time, a Windows laptop each with at least a couple of gigs free, the funds to get an adapter as necessary (link below), resistors and LEDs to see the port lines toggle (from the mass of parts in the lab), breakout cables for the ports, and a breadboard. If we have a set of folks interested in a 2-4 part weekly series of 1-2 hours each, every participant could leave with a basic understanding of how to do some simple loops and switches while controlling external digital lines from a PC.

We could go further and interface to Arduino and Pi, control some lab instruments, etc. We could even take control of some IoT stuff; this is something I have yet to try, but I was thinking about some simple WiFi control systems… could be interesting.

I have had client companies want to dump results into Excel and Word files, even graph results in those tools. I’ve had to log data to SQL databases, even images. I have had to process audio data, even decode audible signaling. I have controlled 7-axis robotic systems with LabVIEW, acquired strain and position data from a bridge support during an earthquake, and measured currents induced in metal by lightning and magnetic events.

Simply stated, LabVIEW is the ideal language for the sorts of things an electronics buff would want to do with a PC. I would say we should gather a list of interested parties for the programming and digital-lines series of 2-4 classes. Further applications classes could be fun from there, but let’s take it from the top and judge interest once we near the end of that introductory set.

I have a counter offer for you. Put the class on the calendar and request an honorarium. If we don’t get enough people to sign up, I will pay you the $50 for your honorarium myself… A basic introduction to get the paradigm into my head would be worth $50 to me…

What I really want to see is how someone puts together a complete application where they retrieve data from an instrument, store it, then analyze and graph it. The actual app doesn’t need to be complex; I just want to see how someone goes through the entire process in the LabVIEW paradigm.

I can read/write programs in over fifty languages, but those skills don’t translate to the graphical programming that LabVIEW seems to use.

4 Likes

They do, it’s just an abstruse way to write a program. You’ll see the mapping eventually, and it will probably annoy you as much as it annoys me!

3 Likes

To each his/her own. LabVIEW didn’t sit well with me initially either. But there I was, an employee of NI, taking the class as a hardware engineer surrounded by software engineers who were better equipped to understand what was going on. For me it was two weeks of that class, followed by 40 hours per week for 3 months surrounded by those same software engineers, programming LabVIEW all day on a massive piece of code used for testing NI hardware. Trial by fire, but I figured out enough of it to create a switch matrix board and write a test suite for the DIO product line of the time.

And I think that is what it might take to develop a comfort level where you could connect to any instrument, control whatever, develop custom solutions, etc. By some estimates, roughly 1 in 200 pieces of code in the world is LabVIEW. It’s tremendously popular in environments where interfacing to new hardware is the name of the game. So IMO it’s worth it. But to develop that comfort you will likely need a comprehensive basics class with programming exercises all day long, followed immediately by several programming projects, basically an immersion in the software. After that, you will be good at writing spaghetti code (a bad LabVIEW programmer). But a bad LabVIEW programmer can make a lot of cool stuff happen. I was one for nearly a decade, and I proudly cranked out plenty of bad code, some of which is still being used to this day.

So I can help however you guys want. We can do the class, but understand that what I wrote earlier is a reasonable starting point. I can also buddy up with a single person and help them, as I have done with quite a few folks through the years. It’s just like anything else: you teach by example and learn by doing.

2 Likes

I’ll probably show up if the classes work out with my schedule.

I spent 12 years programming in LabWindows/CVI and have now switched over to LabVIEW. I’m reasonably proficient, but once in a while I still bang my head against the dreaded “paradigm shift”.

I can help out with the basics and maybe pick up a few things along the way.

Todd

2 Likes

I think LabVIEW is most directly analogous to the ‘schematic view’ that Altera’s Quartus II used to have. It’s there to get people who don’t like code over the hump, but it’s not the objectively optimal way to do it.

Thanks for putting in the time and effort to get people acquainted with the resource!

2 Likes

Rather than creating any custom interfaces, I suggest we use the existing three Rigol DS1054Z oscilloscopes. They have NI LV drivers available: https://www.rigolna.com/products/digital-oscilloscopes/ds1000Z/ds1054z/
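These scopes also speak standard SCPI over USB or LAN. Purely for a text-language feel of what instrument control involves, here is a minimal PyVISA sketch (assumptions: the pyvisa package plus a VISA backend are installed, the resource string is a placeholder, and the measurement command should be checked against the DS1000Z programming guide):

```python
# Minimal sketch of querying a DS1054Z over VISA/SCPI from a text language.
# Assumes pyvisa plus a VISA backend (NI-VISA or pyvisa-py) are installed.
import pyvisa

rm = pyvisa.ResourceManager()
print(rm.list_resources())                 # list connected VISA instruments

# Placeholder resource string; use whatever list_resources() reports for your scope.
scope = rm.open_resource("USB0::0x1AB1::0x04CE::XXXXXXXX::INSTR")
print(scope.query("*IDN?"))                # universal SCPI identification query

# Example measurement; command form from the DS1000Z SCPI set, verify in the manual.
print(scope.query(":MEASure:VPP? CHANnel1"))

scope.close()
rm.close()
```

The point of the class would be building the equivalent graphically with the NI driver VIs, where the front panel and graphing come nearly for free.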

And using LV with an actual instrument is closer to why I think these classes are important to the space. In particular, we have a VirtualBench, which has some custom software that allows you to use it like normal bench equipment, albeit with a really poor interface. But if people knew LabVIEW, this VirtualBench could be made to shine for certain types of projects. Ideally, we (Art and I) want these classes to get people up to speed with the skills they need for that.

I have purchased a copy of LV Home Edition, since all of my bench gear at home comes with LV drivers, and I can potentially use them in ways I could not without LV programming (at least not as easily).

1 Like

Another possibility for real-world interaction from LabVIEW is LINX. It’s been updated a lot since I last looked at it, so I’m not totally up to speed, but the concept is that you install a client program on an Arduino, then call LabVIEW VIs to send/receive data from the resources on the Arduino: I2C, SPI, A-to-D, etc. The benefit is that you get nice display and analysis capability in LabVIEW that you would otherwise have to code from scratch. It’s NOT an uber-high-performance interface (most < $30 chunks of hardware aren’t…), but it could be useful.

More information: https://www.labviewmakerhub.com/doku.php?id=libraries:linx:start
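For anyone coming from the text-language side, the same client-on-the-board pattern exists elsewhere, for example Firmata in the Python world. A rough sketch for comparison only (assumes the pyfirmata package and the StandardFirmata sketch already uploaded to the board; the port name is a placeholder):

```python
# Sketch of the same "client firmware on the board, host makes the calls" pattern,
# shown with Firmata/Python rather than LINX/LabVIEW, for comparison only.
# Assumes pyfirmata (pip install pyfirmata) and StandardFirmata flashed to the Arduino.
import time
from pyfirmata import Arduino

board = Arduino("/dev/ttyACM0")   # placeholder port; e.g. "COM4" on Windows

for _ in range(5):
    board.digital[13].write(1)    # host asks the client firmware to drive pin 13 high
    time.sleep(1)
    board.digital[13].write(0)    # and low again
    time.sleep(1)

board.exit()
```

LINX gives you the same idea, with LabVIEW front panels and graphs on the host side.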

2 Likes

http://www.ni.com/academic/students/learn-labview/

1 Like

And there we go with the topic of LabVIEW and NI stuff luring all the smart people out from behind their monitors…

All right then, how about we hit all of those topics, and each person does an assignment and gives a presentation to the class about how they pulled it off? I know of some real brain teasers if you all are the clever/crafty ones…

But to get to this point, realize that we would need to start from basic programming constructs, data types, execution and debug options, file types and organization, etc. So I can fast-track everyone through the drudgery of how to create, index, and use nested loops to cycle through your 2D array of 32-bit floats, but I’d like to see that you guys can whip out some simple programming examples using those same constructs you are already familiar with from other languages. Once I’m satisfied that the class knows how to properly escape their while loop, and either knows where XOR is or can find it, then we can pull out a list of exciting interface and control examples. In fact, I want a camera person to capture that class, and the demonstrations as well.
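To ground what I mean by those warm-up constructs, here are the text-language equivalents in Python (a sketch for comparison only; in class we would build the same logic graphically with LabVIEW loop structures and array functions):

```python
# Text-language versions of the warm-up constructs: nested loops over a 2D array
# of 32-bit floats, a while loop with an explicit stop condition, and an XOR.
import numpy as np

data = np.random.rand(4, 8).astype(np.float32)   # a 2D array of 32-bit floats

# Nested for loops: visit every element and accumulate a sum.
total = 0.0
for row in range(data.shape[0]):
    for col in range(data.shape[1]):
        total += data[row, col]
print("sum of all elements:", total)

# While loop with a stop condition (the "escape your while loop" part).
count = 0
while count < 10:
    count += 1
print("loop ran", count, "times")

# Boolean XOR of two front-panel-style switches.
switch_a, switch_b = True, False
print("XOR:", switch_a ^ switch_b)
```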

And if we develop this sort of interest in LabVIEW as an art, which it really is IMO, I also want to get the LabVIEW Arduino plug-in. It allows the user to write their Arduino code in LabVIEW, then download it to the device. In fact, should we do this class, one of the programming projects will be to render the LabVIEW logo on one of the Arduino units that has a color display.

So that would make the Arduino plug-in something we need ($99 regular price). I don’t like spending money on software, but this is the one plug-in I would recommend purchasing for DMS. I can get extended evals and student versions of most things, especially given that it’s for a non-profit geared much like an educational institution. But this plug-in is something I think everyone at DMS would be interested in using. It may also be a clever way to roll out the basics, because the set of functions you can use is much reduced: the palettes containing the functions for the block diagrams are much simplified, as they contain only the most basic core structures and operators used for programming. Familiarizing yourself with these palettes and menus is the most daunting part of learning LabVIEW. This plug-in has the perhaps unintended effect of simplifying things tremendously, since I suppose deriving coefficients for a Chebyshev filter was not at the top of the list of necessary Arduino core functions. Aledyne is the maker of the Arduino LabVIEW compiler; LINX will compile for Raspberry Pi and BeagleBone, and can communicate with LabVIEW on a PC.

OK, enough rambling. I will read up on setting up a class and get on the schedule.

I may pony up for pizza should it look like we’ll have a nice turnout. But I must warn you: pizza is for those who came prepared with LabVIEW and the device drivers already installed on their laptop (I can help with student license setup as long as you are using it solely for your education, no commercial LV usage… there will be a huge ‘student version’ watermark across your running code). So I say come prepared to program (the LabVIEW/drivers install takes 1-2 hours, so please plan ahead) and to eat the official food of this particular LabVIEW programmer.

Oh yeah, what day of the week works for a 2-hour class, likely recurring for several weeks? Time of day?

Also, I would like a camera operator to film this so I can have a copy of the raw footage.

One small clarification: LINX just installs a client program on the Arduino / target SBC. The purpose is to use it as cheap I/O.

The Aledyne LabVIEW Arduino compiler actually makes code that runs standalone on an Arduino. I’ve read about it, but not tried it. Very interested in doing so.

I’ll also second the comment that finding your way around the palettes of LabVIEW functions is the single hardest part of learning LabVIEW. It is all graphical, pure pattern recognition. And there is no zoom function until the next release (and there will be much rejoicing, finally…). Anyway, that part just takes practice. There are search functions, but you have to remember what to search for; again, you have to know the palettes. That, I think, is the biggest stumbling block. The learning curve flattens out a bit once you use it enough to remember where to look.

Sounds like a good plan for the classes.

1 Like

Day / time: Tuesday, Wednesday, Thursday, 1900ish would work best for me. Weekends are always busy.

1 Like

So I will shoot for Thursdays from 7-9 PM. I’ll set up for three weeks to start out, and see how things go.

Simply stated, you have to do some programming to get over the paradigm shift. So let’s do a little bit of it. I think you will find that it is quite easy and fast to make things happen. As an example, I think I could write a program to make a front panel light flash and have it running in about 15 seconds.

Is everyone cool with bringing a laptop with LabVIEW and the Device Drivers from www.ni.com PREINSTALLED? For this class to be useful, it should be about you guys programming, less about me talking.

Also, I am trying to set up the event in the calendar system. The event entry system is not allowing me to select a date for the event. Looks like a technical glitch. Whose cage do I rattle for something like this?

1 Like

The class must be at least 240 hours out for honorariums, and you need a W9. If either of those is false, it will not let you submit. For any other issues, ask @AlexRhodes

It would be nice if it had really been an actual website problem that held me back enough to query Talk about it. Truth be told, shortly after writing that, I discovered that the big holdup was that I needed to click in a different portion of the date/time selection box to get it to respond. Genius on my part, huh? Regardless, thanks for the prompt response… I’m working on getting it all filled in properly to hopefully lock it in…

So I’ll set up the first 2-hour class of an N-part series, with N determined by interest level. The first class will be all about basic constructs and running code. By the end of this class you will have written 5 or more little programs, so you should be comfortable running code and know where to find basic constructs and locate info about them.

It will be in a subsequent class session that we start integrating with the outside world, as I want to ensure that everyone first knows what time it is when it comes to basic operation of the interface.

So I’ve had the pleasure of introducing LV to several tech-geek-inclined individuals through the years. One of my cooler training accomplishments was a signal integrity lab tech at a chip company. I had implemented a general-purpose signal integrity measurement system that could sweep supply rail voltage and DUT chip temperature through user-defined ranges of settings. I provided a method for user-definable DUT chip stimulus that would also trigger a scope (or other instrument), which would in turn fill a result matrix of size M × N with a user-definable measurement from a range of possible instruments. Examples in the matrix: scope screenshots, or scope-measured quantities like amplitude, rise time, etc. And you could use a spectrum analyzer, voltmeter, or really any other sort of instrument you wanted for the Z-axis measurement. Finally, if the selected result was numeric, I provided an option for a 3D color plot output as a JPG file for visual effect.
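Stripped to its bones, the structure was just a pair of nested sweeps filling a result matrix. A rough text-language sketch (Python here, with placeholder set_supply_voltage / set_temperature / run_stimulus_and_measure functions standing in for the real instrument drivers):

```python
# Bare-bones sketch of the sweep structure described above: nested loops over
# supply voltage and temperature filling an M x N result matrix. All three
# functions below are placeholders for real instrument-driver calls.
import numpy as np

voltages = np.linspace(3.0, 3.6, 7)        # M supply-rail settings (example values)
temperatures = np.linspace(-40, 85, 6)     # N temperature settings (example values)

def set_supply_voltage(v): pass            # placeholder: program the power supply
def set_temperature(t): pass               # placeholder: command the thermal chamber
def run_stimulus_and_measure(): return 0.0 # placeholder: trigger the DUT, read the scope

results = np.zeros((len(voltages), len(temperatures)))
for i, v in enumerate(voltages):
    set_supply_voltage(v)
    for j, t in enumerate(temperatures):
        set_temperature(t)
        results[i, j] = run_stimulus_and_measure()   # e.g. rise time or amplitude

np.savetxt("sweep_results.csv", results, delimiter=",")  # log the matrix for plotting later
```

In LabVIEW the same skeleton is two nested loops on the block diagram, with the result array wired out to a graph or file.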

I had written some pretty cool pieces of software, but this one was especially cool, since the primary premise of signal integrity as a science is to ensure that signal quality stays intact when the design runs at max clock rate while the following parameters are swept within their allowable ranges: supply voltage, silicon process variability (driver and receiver), PCB glass dielectric range, trace physical dimensions, temperature, and operational switching noise. As long as all 0s are 0s and 1s are 1s at every possible combination of these parameters, at every conceivable value, for every single signal on the design in question, that design is said to have good signal integrity. Prior to my coding this, I had seen extremely overpriced SI experts painstakingly measure signals one by one, capturing waveforms manually and overlaying them for a nice infinite-persistence-style sweep of measurements.

Suffice it to say that LabVIEW talent was needed to manage and/or operate this signal integrity characterizer. As is often the case, the code was intended to run in the LabVIEW editing environment so as to preserve maximum capability for the lab tech to alter it at will, without the hindrance of compiling to an exe. So when I took a different employment opportunity, chasing the almighty $, I had to train a replacement who could operate this LabVIEW-automated SI characterization system. From what I could see, the other SI engineers were not adept enough in the lab to handle this beast. Instead, the lab technician, with a 2-year technical school degree (DeVry, I think), looked like the winner for this one. He was a true lab tech, and he made prototypes and fixtures that took a bit longer than I was expecting but were solid and cross-checked… they worked every time (very rare in the EE industry; usually the first pass goes back for rework at least a million times).

So I worked with this lab tech on LabVIEW for my remaining time there. And he seemed to absorb it, because when I probed, he answered correctly, every single time. But upon my exit, I never knew for sure whether or not he had control of that situation. He looked good when I explained things, but as I have learned, some folks get good at looking like they are getting it… a product of years of pretending to pay attention in school, I guess.

Well, I got my answer some 10 years later when social media relinked us. He was at the same company, and in fact he had evidently rocketed past all those SI engineers with their four-year degrees who never went into the lab. It looked to me like he was running the show, and I would bet that having that SI characterizer alongside a copy of LabVIEW put him in the driver’s seat when it came to correlating simulation to measurement. You see, the simulation-only fellows can put on a great show and talk a good talk. But the lab tech can perform that same simulation, then run it in the lab, and if he becomes skilled enough to learn the secrets of proper correlation, then his simulation results are backed up with measurement. I’d bank on his simulation predictions any day over the promises of the SI engineer who never leaves his office to venture forth into the lab.

So why the long story? This is why: to set expectations. I would not expect that you will be able to match or outperform the LabVIEW dude teaching the class as a result of taking it. I have been working with electronics and software as my indulgent obsession for over 30 years, LabVIEW for nearly 20. On the other hand, I would take this class quite seriously and learn LabVIEW well. You see, the marriage of your existing software skills with the ability to interface directly with electronics, in an environment that is many times quicker to program and see results in than what you are used to… well, that marriage makes you an incredible asset for any sort of makery project you could possibly work on.

And that segues into an even more important topic: your value. As a hardworking, creative DMS electronics and/or software enthusiast, you have incredible value prior to a LabVIEW class, and even more so after it. With teamwork, I think we can each tap a nice $lice of our collective value. But this is a topic for another day…

I have not forgotten. Class will get scheduled. I am working out all the specifics. Please bear with me. Thx…

2 Likes