Archive.zip


Riley Dyen

Aug 3, 2024, 3:54:39 PM
to gibseysweatep

Hello, and thank you for your reply about panic mode.
I'm not a robot integrator; I'm just a user of a KUKA robot for stone manufacturing.
I also don't know how to make the robot save the archive.zip to a USB stick.
I already plugged in a USB stick and the computer didn't recognize it. What can I do, step by step, so the computer recognizes the USB stick?
Tomorrow morning I'll send new photos of the errors that appear. Yes, it's the archive.zip generated by the robot itself, and I stored it on another computer just because I don't know when I'll need it.

Good morning.
As I said last night, let's try to fix the KUKA robot with your help.
First of all, I will attach the archive.zip so you can see that the archive is from this robot.
One other thing: can I simply copy and paste files between archives? When I look at them, I can see that some files, such as $config.dat, are OK in one archive.zip but not in another.

Good morning,
I already did the restoration and everything else.
I feel alone in this because I don't have anyone to talk to about it, and in Portuguese it's very difficult to find anyone who understands what I'm trying to say.
I have all the CDs that came with the robot. I already put the KR C V5.2.14 Build 73 CD in the drive and installed it, but someone warned me that if I insert another CD,
the robot performs a basic restoration and the program that is on the robot will be deleted.
I can't restore the archive.zip: the robot computer gives me a message that the file is corrupted. I tried several times to write the file from several computers, to USB sticks, and from diskette to diskette, and in all of them I get the same error, so it can't restore the archive.zip.
What am I doing wrong?
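One way to rule out a bad copy (as opposed to an archive that was already corrupt on the robot) is to test the ZIP's internal consistency and compare checksums before and after each copy. A minimal sketch using Python's standard library; the file names here are stand-ins created for the demo, not the robot's real archive:

```python
import hashlib
import zipfile

# Build a tiny stand-in archive so the example is self-contained;
# in practice you would point these checks at the robot's archive.zip.
with zipfile.ZipFile("archive_copy.zip", "w") as zf:
    zf.writestr("config.dat", "demo contents")

# testzip() reads every member and verifies its CRC; it returns None
# if the archive is intact, else the name of the first bad member.
with zipfile.ZipFile("archive_copy.zip") as zf:
    bad = zf.testzip()
print("archive OK" if bad is None else f"corrupt member: {bad}")

# Compare this hash before and after each copy (USB stick, diskette, ...);
# if the hashes differ, the copy step itself corrupted the file.
with open("archive_copy.zip", "rb") as f:
    print(hashlib.sha256(f.read()).hexdigest())
```

If the hashes match on every machine but the robot still reports corruption, the original archive was most likely already damaged when it was saved.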

However, before doing that -- what is wrong with the robot? You've explained that your Archive.zip file is corrupted, but why are you trying to restore it? Does the robot have a serious problem? Be aware, if you use the install CDs, the robot will be erased -- any programs you have will be lost, permanently, and the robot will be set back to a factory-new condition.

Regarding the program that is installed on the robot's PC:
Will it be erased from the robot's PC if I insert all the CDs to reinstall everything?
Will it still be in the same place, on the display key in position 6, actionplus?
All I did was simply shut down the robot, nothing else, but when I turned it on the problems emerged.

When the spark-submit options don't work, Data Flow has the option of providing a ZIP archive (archive.zip) along with your application for bundling third-party dependencies. The ZIP archive can be created using a Docker-based tool. The archive.zip is installed on all Spark nodes before running the application. If you construct the archive.zip correctly, the Python libraries are added to the runtime, and the JAR files are added to the Spark classpath. The libraries added are isolated to one Run. That means they don't interfere with other concurrent Runs or later Runs. Only one archive can be provided per Run.

Anything in the archive must be compatible with the Data Flow runtime. For example, Data Flow runs on Oracle Linux using particular versions of Java and Python. Binary code compiled for other OSs, or JAR files compiled for other Java versions, might cause the Run to fail. Data Flow provides tools to help you build archives with compatible software. However, these archives are ordinary Zip files, so you're free to create them any way you want. If you use your own tools, you're responsible for ensuring compatibility.

Dependency archives, like your Spark applications, are loaded to Data Flow. Your Data Flow Application definition contains a link to this archive, which can be overridden at runtime. When you run your Application, the archive is downloaded and installed before the Spark job runs. The archive is private to the Run. This means, for example, that you can run two instances of the same Application concurrently, with different dependencies, without any conflicts. Dependencies don't persist between Runs, so there aren't any problems with conflicting versions for other Spark applications that you might run.

The Data Flow Dependency Packager uses Apache Maven to download dependency JAR files. If you have JAR files that can't be downloaded from public sources, place them in a local directory beneath where you build the package. Any JAR files in any subdirectory where you build the package are included in the archive.
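The sweep-up behavior described above can be sketched in a few lines. This is an illustration of the idea, not the packager's actual implementation; the directory name `build_dir/local_jars` and the `java/` archive prefix are assumptions for the demo:

```python
import pathlib
import zipfile

# Stand-in for a build directory containing JARs that can't be fetched
# from public Maven repositories (names are illustrative, not official).
root = pathlib.Path("build_dir")
(root / "local_jars").mkdir(parents=True, exist_ok=True)
(root / "local_jars" / "vendor-lib.jar").write_bytes(b"placeholder jar bytes")

# Mimic the packager's behavior of including every *.jar found in any
# subdirectory of the build location in the dependency archive.
with zipfile.ZipFile("archive.zip", "w") as zf:
    for jar in root.rglob("*.jar"):
        zf.write(jar, arcname=f"java/{jar.name}")

with zipfile.ZipFile("archive.zip") as zf:
    print(zf.namelist())  # ['java/vendor-lib.jar']
```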

When the Data Flow application runs, the static content is available on any node under the directory where you chose to place it. For example, if you added files under python/lib/ in the archive, they're available in the /opt/dataflow/python/lib/ directory on any node.
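Application code can then read that static content from the node-local path. A minimal sketch, assuming a hypothetical data file `model.txt` shipped under `python/lib/` in the archive; the local fallback lets the same code run on a laptop for testing:

```python
import os

# Where Data Flow unpacks python/lib/ content on each node (per the docs
# above); the file name model.txt is purely illustrative.
DATAFLOW_LIB = "/opt/dataflow/python/lib"
path = os.path.join(DATAFLOW_LIB, "model.txt")

# Fall back to a local copy when running outside Data Flow, so the same
# code works both on a development machine and on the cluster.
if not os.path.exists(path):
    path = "model.txt"
    with open(path, "w") as f:  # create the stand-in for this demo
        f.write("demo weights")

with open(path) as f:
    print(f.read())  # prints "demo weights" when run outside Data Flow
```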

Dependency archives are ordinary ZIP files. Advanced users might choose to build archives with their own tools rather than using the Data Flow Dependency Packager. A correctly constructed dependency archive has this general outline:
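The outline itself is cut off in this copy of the document. As an illustrative sketch only (consistent with the python/lib/ path mentioned above, but not an official specification), an archive bundling a Python package, static content, and a third-party JAR might be built like this:

```python
import zipfile

# Hypothetical layout: Python content under python/lib/, JARs under java/.
# Member names and contents are placeholders for illustration.
with zipfile.ZipFile("dependency_archive.zip", "w") as zf:
    zf.writestr("python/lib/mypackage/__init__.py", "")           # a Python package
    zf.writestr("python/lib/settings.conf", "key=value\n")        # static content
    zf.writestr("java/vendor-lib.jar", b"placeholder jar bytes")  # a third-party JAR

with zipfile.ZipFile("dependency_archive.zip") as zf:
    for name in zf.namelist():
        print(name)
```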
