...
You will need to know where in Mediaflux to put your data (the path given to the --dest argument of the command) and where on the local system to upload from (the last positional argument).
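In schematic form the command looks like the sketch below. The angle-bracket placeholders are illustrative only, not literal arguments:

```
unimelb-mf-upload [options] --dest <destination path in Mediaflux> <local source directory>
```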
Example 1 - parallel upload with checksum check
Upload data with four worker threads and turn on checksums for upload integrity checking (recommended). As the location of the configuration file is not specified, the client will look for it in the .Arcitecta directory of your home directory.
```
unimelb-mf-upload --csum-check --nb-workers 4 --dest /projects/proj-myproject-1128.1.59/12Jan2018 /data/projects/punim0058
```
Example 2 - using a configuration file
Upload data with one worker thread and explicitly specify the location of the configuration file.
```
unimelb-mf-upload --mf.config /Users/nebk/.Arcitecta/mflux.cfg --dest /projects/proj-myproject-1128.1.59/12Jan2018 /data/projects/punim0058
```
The configuration file might look like this:
```
host=mediaflux.researchsoftware.unimelb.edu.au
port=443
transport=https
token=phooP1Angohb2ooyahbiLiuwa6ahjuoKooViedaifooPhiqu1ookahXae7keichael4Shae2ael8ietit2phawucai0Aighifu6olah9OquahDei2aevae3keich8ain1OoLa4O
```
Checksums
Checksums (numbers computed from the contents of a file) are an important data integrity mechanism. The Mediaflux server computes a checksum for each file it receives. The upload client can compute checksums from the source data on the client side and compare them with the checksums computed by the server when it receives the files. If the checksums match, we can be very confident that the file uploaded correctly. Many clients for other protocols (e.g. SFTP and SMB) do not do this.
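As a purely local illustration of the idea (not how the upload client works internally, and not necessarily the algorithm Mediaflux uses), you can compute a checksum of a file yourself and observe that it changes when the file changes:

```
# Compute a checksum of a file; any change to the file changes the checksum.
md5sum mydata.csv
# Append a single byte and compute it again; the value will differ.
printf 'x' >> mydata.csv
md5sum mydata.csv
```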
...
If any of these checks fail, the file is treated as not pre-existing and will be uploaded again. If the path and name are the same but the content of the source file has changed, the file will be uploaded as a new version of the pre-existing asset in Mediaflux.
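A practical consequence is that it is safe to re-run an interrupted or repeated upload: files that already exist on the server are skipped, and only new or changed files are transferred. For instance, re-running the command from Example 1 (the same hypothetical paths) resumes rather than duplicates the upload:

```
# Safe to re-run: unchanged files already on the server are skipped;
# changed files are uploaded as new versions of the existing assets.
unimelb-mf-upload --csum-check --nb-workers 4 --dest /projects/proj-myproject-1128.1.59/12Jan2018 /data/projects/punim0058
```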
Scheduled uploads
If you have a location that should be uploaded on a regular schedule, such as an instrument PC that saves data to a given directory on the local computer, you can schedule uploads with unimelb-mf-upload. If you want to do this, it is best to request an upload token, as the credential will be stored on the computer that is doing the uploads. Contact Research Computing Services to request a token.
Windows
In this example:
- we will put the unimelb-mf-client files in the %HOMEPATH%\Documents directory
- we will save logs to the %HOMEPATH%\Documents\logs directory
- we will put the configuration file in the %HOMEPATH%\Documents directory
Download from the GitLab page, selecting the Windows 64-bit release. Extract the zip file to %HOMEPATH%\Documents.
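If the logs directory does not already exist, create it before the first run. This is a sketch using the paths assumed in this example; adjust it if you chose different locations:

```
:: Create the directory that --log-dir will point at (skip if it already exists).
mkdir "%HOMEPATH%\Documents\logs"
```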
Create a batch file to perform the upload using Notepad. In our example, it will be stored in %HOMEPATH%\Documents\upload.bat:
```
%HOMEPATH%\Documents\unimelb-mf-clients-0.7.7\bin\windows\unimelb-mf-upload --mf.config %HOMEPATH%\Documents\mflux.cfg --log-dir %HOMEPATH%\Documents\logs --dest /projects/proj-demonstration-1128.4.15 %HOMEPATH%\Documents\data-to-upload
```
Schedule the upload using Windows Task Scheduler:
- Click the Start button, start typing Task Scheduler, and select it from the Start Menu when it appears.
- Click on the Task Scheduler Library, then right-click in the empty space and choose Create Basic Task... from the menu.
- Give your task a name and description, then click Next >.
- Choose a start date and time and click Next >.
- Choose Start a program and click Next >, then click the Browse button, find the script you created above, and click Next >.
- Check the Open the Properties dialog for this task when I click Finish box, then click Finish.
- Under the Security options box, choose which user the task should run as. You may wish to configure the task so that it runs even if the user is not logged in.
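Alternatively, the same scheduled task can be created from a Command Prompt with the built-in schtasks utility. The task name and start time below are placeholders, not values required by the upload client:

```
:: Run upload.bat every day at 22:00 under the current user.
schtasks /Create /TN "Mediaflux upload" /TR "%HOMEPATH%\Documents\upload.bat" /SC DAILY /ST 22:00
```

Adding the /RU and /RP options lets the task run under a specific account whether or not that user is logged in; see schtasks /? for details.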