Library Management System Node Js

Vennie Fireman

Aug 3, 2024, 1:49:22 PM
to spinarpizo

I'm getting an error 7 "File Not Found" from a Call Library Function Node, despite the fact that I can see via probe that the wire value for the input Path is correct, and I have triple-checked that the DLL file I want to call is at that file system location.

For context, I am running an NI Hypervisor system using LabVIEW 2012 with Real-Time and FPGA modules. The code is running on the real-time side, and I have it set up to build a path to one of multiple different engine model DLLs when the selected engine model changes elsewhere in the code, and to maintain the existing DLL path otherwise.

I have verified the DLL files are at the correct path on the real-time file system, e.g. "c:\ni-rt\startup\data\ExtraSpecial_EngineModelBoosted.dll". I have verified via probes that the path gets rebuilt correctly when the engine model changes.
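The select-or-keep path logic described above can be sketched as follows. This is an illustrative Python analogue of the LabVIEW wiring, not the actual code; the function and parameter names are invented, and only the base folder and model name come from the post.

```python
# Sketch of the path-selection logic described above (not the actual
# LabVIEW code): build a new DLL path when the selected engine model
# changes, otherwise keep the existing path. All names are illustrative.

BASE = r"c:\ni-rt\startup\data"

def dll_path(selected_model, previous_model, previous_path):
    """Return (model, path) to feed the Call Library Function Node."""
    if selected_model == previous_model:
        return previous_model, previous_path  # no change: reuse existing path
    return selected_model, rf"{BASE}\{selected_model}.dll"

# With the model name from the post:
model, path = dll_path("ExtraSpecial_EngineModelBoosted", None, None)
```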

The system was working correctly at the start of my project, and I am not sure what changed such that it is not working now. I did add support for a new engine model, including putting a new DLL in the above-referenced "data" folder, but this code did not need to be updated for that. I have also verified that all of the Call Library Function Node configuration options are the same as before I started.

My searches on this topic mostly turned up suggestions about making sure the development environment matches the deployment environment, but that isn't applicable in this case because I am setting up a computer / device that I will then give to the customer, so there is only the one environment.

In the real-time executable build specification, I do have the engine model DLLs in the "Always Included" section of the "Source Files" properties, the "data" folder referenced above set as a support directory in the Destinations properties, and those DLLs set to deploy to that support directory. I'm not sure how much that matters, though, because I am also placing the DLLs directly into that "data" folder on the real-time file system via FTP using Windows Explorer. Do those settings mean that, if I had not manually placed the DLLs myself, LabVIEW would do so when I deploy the real-time executable? And since I did place them manually, does it matter that those settings are set that way?

Below is a picture of the relevant section of code, for what it is worth. If anyone has any ideas as to why I'm getting an error 7 "File Not Found", I'd really appreciate your help! I've been banging my head on this one for a while.

The DLL was compiled using Microsoft Visual Studio 6.0 on Windows XP, with some version of the Visual FORTRAN plug-in (I can get more specific details in a little bit). This is unfortunately the only system we have that will compile the original FORTRAN code of the engine model (with a wrapper written in C).

Visual Studio 6 uses the standard msvcrt.dll, which was also included in the Pharlap ETS export list (with the exception of some private exports from the Microsoft runtime). So that part should be fine. With newer Visual Studio versions you run into the problem that each of them uses its own msvcrXX.dll. But you can't just go and install the standard Microsoft redistributable C runtime installer on Pharlap, since those runtimes depend on private Windows kernel APIs that Microsoft never (or only after the fact) documented and that were therefore not available in Pharlap ETS.

But Fortran of course throws a huge amount of potential trouble into this. Those Fortran DLLs themselves depend on Microsoft Fortran runtime libraries, which can have a whole set of compatibility problems of their own with respect to loading on Pharlap ETS. Do those Fortran libraries have an option to be compiled as self-contained, similar to what you can do for C DLLs (selecting Multithreaded runtime support instead of Multithreaded DLL support in the compile options)? For a C DLL this will in fact include the C runtime library in the DLL (making it bigger, of course), but it often circumvents the problem of having to install Visual-Studio-version-specific runtime libraries on every target system. It's still no guarantee that it will run on Pharlap ETS, since Microsoft also likes to make use of the newest and greatest Win32 APIs in those runtime functions, and the Pharlap ETS API is really at about the state of Windows NT4/2000; APIs introduced (or documented) after that are almost never included. Pharlap/Ardence/IntervalZero/NI added (stubbed) support for a few later Win32 APIs in a few select cases, but they are few and far between.

Generally you want to check all the DLLs with a dependency-checker-like tool (for instance this) to see what dependencies each has. NI also has a download page where they provide special tools that you can point at a DLL; the tool will tell you if the DLL references APIs that the corresponding LabVIEW Real-Time Pharlap ETS kernel doesn't support. This tool is different for each LabVIEW version. It may be a good idea to archive that page somewhere else, since with the discontinuation of LabVIEW Real-Time on Pharlap ETS the page is bound to disappear at some point when someone at NI cleans up the website.
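What such a dependency check does at the lowest level can be sketched with nothing but the Python standard library: read the PE/COFF headers of the DLL and walk its import directory to list the DLLs it links against. This is a minimal illustration (offsets follow the published PE/COFF layout; error handling and bounds checking are omitted), not a replacement for a real checker, which also compares the imported symbols against the target's export list.

```python
# Minimal sketch of a dependency check: read the PE/COFF import table
# of a DLL and list the DLLs it links against. Stdlib only; supports
# PE32 and PE32+; no error handling.
import struct

def rva_to_offset(rva, sections):
    """Map a relative virtual address to a file offset via the section table."""
    for va, vsize, roff, rsize in sections:
        if va <= rva < va + max(vsize, rsize):
            return roff + (rva - va)
    raise ValueError("RVA not mapped by any section")

def list_imports(path):
    """Return the DLL names in the import directory of a PE file."""
    data = open(path, "rb").read()
    pe = struct.unpack_from("<I", data, 0x3C)[0]       # e_lfanew -> PE header
    assert data[pe:pe + 4] == b"PE\0\0"
    nsec, = struct.unpack_from("<H", data, pe + 6)     # NumberOfSections
    opt_size, = struct.unpack_from("<H", data, pe + 20)
    opt = pe + 24                                      # optional header start
    magic, = struct.unpack_from("<H", data, opt)       # 0x10B=PE32, 0x20B=PE32+
    dd = opt + (96 if magic == 0x10B else 112)         # data directories
    imp_rva, = struct.unpack_from("<I", data, dd + 8)  # entry 1 = import table
    sections = []
    for i in range(nsec):                              # 40-byte section headers
        s = opt + opt_size + 40 * i
        vsize, va, rsize, roff = struct.unpack_from("<IIII", data, s + 8)
        sections.append((va, vsize, roff, rsize))
    names, off = [], rva_to_offset(imp_rva, sections)
    while data[off:off + 20] != b"\0" * 20:            # null descriptor ends list
        name_rva, = struct.unpack_from("<I", data, off + 12)
        n = rva_to_offset(name_rva, sections)
        names.append(data[n:data.index(b"\0", n)].decode("ascii"))
        off += 20
    return names
```

Pointed at one of the engine model DLLs, such a script should list DFORRT.DLL among the dependencies; NI's validation tools go further and flag individual imported functions the Pharlap ETS kernel doesn't export.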

However, the previously existing engine model DLLs depend on DFORRT.DLL, which is already in the same folder as the engine model DLLs on the target system, and neither it nor the older engine model DLLs have any bad calls (and only DFORRT has any stubbed calls). This makes sense as the older engine models used to work. And yet now they do not.

Sounds like you are not using the same version of the Fortran compiler as was used back then for the older engine model DLLs. It's definitely not something LabVIEW has any way to influence; you need to check on the compiler side.

Unfortunately, none of that explains why the existing engine models from our configuration management system would stop working. I confirmed that the DFORRT.DLL file from our configuration management system is on the real-time side's file system, in the same folder as the engine model DLLs (which is where it always was, if I'm reading our configuration correctly). I even deleted it and re-added it in case it had gotten corrupt or left open by another process or something, although our multiple restarts would have fixed the latter.

I'm not sure about the exact standard locations the Pharlap ETS system will look in for dependent libraries when trying to load DLLs, but I'm pretty sure the directory where your primary DLL is located is not privileged by Pharlap ETS over any other directory on the system. And it certainly won't search the whole disk for a DLL either.

So your DFORRT.DLL that you see in that directory may just be sitting there looking pretty, without serving any function whatsoever. On the system where it worked, someone had probably figured out that this DLL should instead be copied into the C:\Windows\System directory, or whatever the equivalent is on Pharlap (I don't have a system handy to check), to make it work!

That was it, Rolf, thank you! The DFORRT.DLL (not case-sensitive, by the way, which was another thing I checked) file needed to be put into the "C:\ni-rt\system" folder on the real-time side, rather than in the same folder as the engine model DLL. That was not correctly documented in our installation / setup process, and the file in that location must have been lost at some point during a system restore.

The new engine model having extra dependencies is something we are actively addressing by paring down its function calls to match the older models, but now that the older engine model DLLs are working again I am confident that we can get the whole system working.

Throughout this workshop, you will have the opportunity to interface with the data using the library management system application we've provided you. This application uses an Angular front end and a Node.js back end. The application is already built and deployed to a GitHub codespace, so you won't need to install anything in your local environment. You will be able to access the application from your browser.

Our library management application leverages the data modeling patterns you'll explore throughout this workshop. As you navigate through the application, you'll witness how these patterns come to life, enhancing both the user experience and the efficiency of the system.

Our library application enables users to explore a rich catalog of books, search for specific titles, delve into book details, and even share their thoughts through comments. Users can also reserve books for future reading. Behind the scenes, MongoDB's data modeling patterns play a crucial role in ensuring the application's flexibility, scalability, and performance.

For library employees, an admin panel offers functionalities like managing book checkouts and returns. As we guide you through the application, you'll discover how the various design patterns we've discussed so far are seamlessly integrated to optimize data storage, retrieval, and overall system performance.
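As a concrete illustration of the kind of data modeling the workshop refers to, here is a hypothetical shape for a book document in the catalog. The field names and patterns shown (embedding a subset of recent comments, storing an extended reference for reservations) are invented for illustration and are not the workshop's actual schema.

```python
# Hypothetical book document for the library catalog, illustrating two
# common MongoDB modeling patterns. Field names are invented, not the
# workshop's actual schema.
book = {
    "_id": "9780451524935",
    "title": "Nineteen Eighty-Four",
    "authors": ["George Orwell"],
    "copies_available": 2,
    # Subset pattern: only the latest few comments are embedded in the
    # book document (fast detail-page reads); the full comment history
    # would live in a separate collection.
    "recent_comments": [
        {"user": "reader42", "text": "A classic.", "rating": 5},
    ],
    # Extended reference pattern: each reservation carries just enough
    # user data to render the admin list without a second lookup.
    "reservations": [
        {"user_id": "u123", "name": "A. Patron", "until": "2024-09-01"},
    ],
}
```

The trade-off in both cases is duplicating a small amount of data in exchange for serving the application's most frequent reads from a single document.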

When I arrived in 2004, the original 1997 installation of Heritage was still running, though no maintenance routines, upgrades or the like had ever been carried out(!). Although this is a postgraduate and research library, its staffing is the lowest of all UCL libraries, and its funding per student is significantly below the UCL average.

As you may remember, I would have preferred, under the special circumstances here at the Eastman (one librarian, one IT person, two part-time library assistants for 1,000 readers), not to have to run my own LMS, and rather to join a consortium. [...], UCL's main system, was unsuitable - too inflexible; [...], which two comparable libraries share, would have been a good solution, but the [...] faction got the Vice Provost to disallow us from joining that... So we were 'stuck' with the local Heritage set-up, but I don't regret it, because, jointly with your engineers and trainers, we have made it a real success, and we have in some respects a more modern and efficient system now than UCL's clumsy [...] is.

First of all, your engineers helped us update, upgrade and implement all the modules we wanted - half the ones my predecessor bought in 1997 had never been installed. We have since automated many procedures, such as Acquisitions and Serials. We asked you if NLM records could be made available through QuickCat, and you achieved this within weeks - another enormous benefit for us as it allows us to download quality records with the right classmarks and subject headings. We wanted a self-issue machine, and although writing the SIP connectivity did take longer than planned it works fine now, and my UCL colleagues look at the machine with envy .... We feel we are getting as much out of our LMS as we could wish for. The greatest advantage of Heritage, however, is the responsiveness of the support team and engineers.
