Gary Ruben / remote_tree_to_local_tars / Commits
Commit 2adc8740, authored 6 years ago by Gary Ruben

Tidying and documentation improvement

parent 56d7f629
Changes: 1 changed file
asci_to_vault.py: 47 additions, 29 deletions
 """
-A script to transfer a tree of data files from a remote server to a local
-computer. This only runs on a destination un*x system and requires an ssh key
-pair to be shared between the systems. See
+A script to transfer a tree of data files from a remote/source server to a
+local/destination computer. This runs on the local Linux machine, on which the
+tape archive system is mounted; in our case, this is a machine at Monash. Prior
+to running this an ssh key pair must be shared between the systems. See
 https://confluence.apps.monash.edu/display/XI/Australian+Synchrotron
-for details on how to do this between a Monash Linux machine and ASCI.
+for details on how to do this between a Monash Linux machine and ASCI
+(Australian Synchrotron Compute Infrastructure). Requires Python 3.7 or higher
+and uses the fabric module.
 
 Authors:
 gary.ruben@monash.edu
 michelle.croughan@monash.edu
 
+Note that the current version creates two files in the same directory as this
+script:
+1. A .log file named based on the start-time timestamp, which is a capture of
+   all stdout activity.
+2. A Python pickle file named tree_state.pickle that contains the transfer
+   state, from which failed transfers can be restarted by setting
+   READ_PICKLE_FILE to True.
+
+Known issues
+------------
+The current version of fabric generates warnings. This issue is discussed
+here: https://github.com/paramiko/paramiko/issues/1369
+
+Notes
+-----
+This is a possible option for checksumming:
+https://stackoverflow.com/q/45819356/
+KERNEL_CHECKSUM=$(cpio --to-stdout -i kernel.fat16 < archive.cpio | sha256sum |
+                 awk '{print $1}')
 """
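The shell pipeline referenced in the docstring's checksumming note can be mirrored in-process with the Python standard library. A minimal sketch (hashlib only, without the cpio extraction step; the helper name is hypothetical):

```python
import hashlib


def sha256_of_stream(fileobj, chunk_size=1 << 20):
    """Hex SHA-256 of a file-like object, read in chunks to bound memory."""
    digest = hashlib.sha256()
    for chunk in iter(lambda: fileobj.read(chunk_size), b""):
        digest.update(chunk)
    return digest.hexdigest()
```

Run on both ends of a transfer and compared, this would play the role of the `sha256sum` call in the shell snippet.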
import os
...
@@ -15,31 +41,22 @@ import subprocess
 import pickle
 import pprint
 import time
-from fabric import Connection
+# This isn't suppressing the warnings that fabric is generating; we need to
+# investigate further
+with warnings.catch_warnings():
+    warnings.simplefilter("ignore")
+    import fabric
+    from fabric import Connection
 
-"""
-This is a possible option for checksumming.
-https://stackoverflow.com/q/45819356/
-KERNEL_CHECKSUM=$(cpio --to-stdout -i kernel.fat16 < archive.cpio | sha256sum |
-                 awk '{print $1}')
-"""
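The comment "This isn't suppressing the warnings" has a likely explanation: `warnings.catch_warnings()` restores the previous filter state when the `with` block exits, so it only silences warnings raised *during* import, not ones fabric/paramiko emit later at call time. A minimal demonstration (the warning-raising function is a hypothetical stand-in):

```python
import warnings


def fabric_like_call():
    """Hypothetical stand-in for a fabric/paramiko call that warns at run time."""
    warnings.warn("deprecated crypto API", DeprecationWarning)


# Warnings raised while the context manager is active are suppressed...
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    fabric_like_call()  # silenced

# ...but the filter state is restored on exit, so a warning emitted by a
# later call (i.e. after import time) reappears.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    fabric_like_call()
assert len(caught) == 1  # the warning is back
```

To suppress the warnings for the whole run, the filter would need to be installed at module scope (e.g. `warnings.simplefilter("ignore", DeprecationWarning)` without a context manager).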
 READ_PICKLE_FILE = False
 EXPERIMENT_NAME = "13660a"
 PICKLE_FILENAME = os.path.join(os.path.dirname(__file__), "tree_state.pickle")
 timestamp = time.strftime("%Y-%m-%d-%H%M%S")
 LOG_FILENAME = os.path.join(
     os.path.dirname(__file__), f"{EXPERIMENT_NAME}-{timestamp}.log")
 REMOTE_LOGIN = "gary.ruben@monash.edu@sftp.synchrotron.org.au"
-# SRC_PATH = "/data/13660a/asci/input"
-SRC_PATH = "/data/13660a/asci/output/tar_test"
-DEST_PATH = "/home/grub0002/bapcxi/vault/rubbish"
+SRC_PATH = "/data/13660a/asci/input"
+# SRC_PATH = "/data/13660a/asci/output/tar_test"
+# DEST_PATH = "/home/grub0002/bapcxi/vault/rubbish"
+DEST_PATH = "/home/grub0002/bapcxi/vault/IMBL_2018_Oct_McGillick"
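The script mirrors each remote directory under the destination root by substituting `SRC_PATH` with `DEST_PATH` in every path returned by `find`. A self-contained sketch of that mapping; the `Node` fields are an assumption (the real `@dataclass` definition is elided from the diff), and the sample directory list is hypothetical:

```python
from dataclasses import dataclass

# Path constants taken from the script.
SRC_PATH = "/data/13660a/asci/input"
DEST_PATH = "/home/grub0002/bapcxi/vault/IMBL_2018_Oct_McGillick"


@dataclass
class Node:
    src: str
    dest: str
    processed: bool = False


# Hypothetical sample of what `find {SRC_PATH} -type d` might return.
remote_dirs = [
    SRC_PATH,
    SRC_PATH + "/scan01",
    SRC_PATH + "/scan01/flats",
]

# Mirror each source directory under the destination root, as the script does.
tree = [Node(src, src.replace(SRC_PATH, DEST_PATH)) for src in remote_dirs]
```

Since every path emitted by `find {SRC_PATH} -type d` begins with `SRC_PATH`, the substitution yields the corresponding destination path for each node.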
@dataclass
...
@@ -124,22 +141,27 @@ def tar_and_send_directory(node):
     node.processed = True
 
 if __name__ == "__main__":
-    sys.stdout = Logger()
+    sys.stdout = Logger()  # Log all stdout to a log file
 
+    # A hacky way to restart an interrupted transfer is to set
+    # READ_PICKLE_FILE = True above so that the transfer state is retrieved. By
+    # default the tree is built from scratch from the remote file system.
     if not READ_PICKLE_FILE:
         # Get the directory tree from remote server as a list
         with Connection(REMOTE_LOGIN) as c:
             result = c.run(f'find {SRC_PATH} -type d')
         remote_dirs = result.stdout.strip().split('\n')
 
-        # Create a tree data structure that represents both source and destination
-        # tree paths.
+        # Create a tree data structure that represents both source and
+        # destination tree paths.
         tree = []
         for src in remote_dirs:
             dest = src.replace(SRC_PATH, DEST_PATH)
             tree.append(Node(src, dest))
     else:
         # Read the saved transfer state from the locally pickled tree object.
         with open(PICKLE_FILENAME, 'rb') as f:
             tree = pickle.load(f)
         print('tree:')
...
...
@@ -155,7 +177,3 @@ if __name__ == "__main__":
         # pickle the tree to keep a record of the processed state
         with open(PICKLE_FILENAME, 'wb') as f:
             pickle.dump(tree, f)
-
-    # possibly delete the 'tree_state.pickle' file here
-    # from IPython import embed; embed()
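The checkpoint/restart pattern the script uses (pickle the whole tree after each pass; on restart, reload it and skip completed nodes) can be sketched with the standard library alone. Plain dicts stand in for the script's `Node` objects here, and the temp-file path is illustrative:

```python
import os
import pickle
import tempfile

# Illustrative location; the script stores the pickle beside itself instead.
STATE_FILE = os.path.join(tempfile.gettempdir(), "tree_state.pickle")


def save_state(tree):
    """Checkpoint the whole tree so an interrupted run can resume."""
    with open(STATE_FILE, "wb") as f:
        pickle.dump(tree, f)


def load_state():
    with open(STATE_FILE, "rb") as f:
        return pickle.load(f)


tree = [
    {"src": "/data/a", "processed": True},   # already transferred
    {"src": "/data/b", "processed": False},  # still to do
]
save_state(tree)
resumed = load_state()
# On restart, only unprocessed nodes need transferring.
todo = [node for node in resumed if not node["processed"]]
```

Because the `processed` flag rides along in the pickle, a rerun with `READ_PICKLE_FILE = True` naturally resumes from the first unfinished directory.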