# Create bucket
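A minimal sketch using `gsutil mb`; the bucket name and region below are placeholders, not values from my setup:

```
# Create a bucket (placeholder name and region)
gsutil mb -l us-central1 gs://my-example-bucket
```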
# List bucket
Note: This is a Class A operation (the more expensive operation class in GCS pricing).
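A sketch of listing, assuming the same placeholder bucket name:

```
# List the objects in the bucket
gsutil ls gs://my-example-bucket
```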
# Copying from local to bucket
The command below initiates a parallel upload via rsync, recursively uploading all folders and sub-folders to GCS.
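A sketch of the parallel rsync upload; the local folder and bucket path are placeholders. `-m` enables parallel transfers and `-r` recurses into sub-folders:

```
# Parallel, recursive upload of a local folder to the bucket
gsutil -m rsync -r ./my-local-folder gs://my-example-bucket/my-local-folder
```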
The command below does the same parallel rsync upload but excludes paths matching a pattern. It excludes folders like node_modules, .git, .cache and …
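A sketch using the `-x` exclude flag, which takes a Python regular expression matched against paths relative to the source; the exact pattern here is illustrative:

```
# Exclude node_modules, .git and .cache folders at any depth (illustrative regex)
gsutil -m rsync -r -x ".*node_modules/.*|.*\.git/.*|.*\.cache/.*" ./my-local-folder gs://my-example-bucket/my-local-folder
```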
If you get an error like `Caught non-retryable exception while listing...`, it is because on Windows, MAX_PATH defaults to 260 characters.
To increase that limit, you have two options:
- Update the Windows Registry (a one-line `reg` command for this is sketched after this list):
  Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem\LongPathsEnabled
  Type: REG_DWORD
  Value: 1
  I tried this and it works.
- Specify UNC paths. Prefix paths with `\\?\`. This raises the maximum path length to 32,767 characters. For example:
  \\?\C:\Users\~~~~\Documents
  I did not experiment with this, but it is good to know there is another option.
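For the registry option above, a sketch of the equivalent `reg` command, run from an elevated Command Prompt (a sign-out or reboot may be needed for it to take effect):

```
:: Enable long paths system-wide (run as administrator)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v LongPathsEnabled /t REG_DWORD /d 1 /f
```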
# Copying from bucket to local
If you want to download the entire bucket, you can do it like this. It will create a folder with the bucket's name in the current directory.
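A sketch, again with a placeholder bucket name; this creates a `my-example-bucket` folder under the current directory:

```
# Recursively copy the whole bucket into the current directory
gsutil -m cp -r gs://my-example-bucket .
```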
If you want to download into a specific folder, that folder needs to exist at the target. Here I have created a new folder called t1.
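A sketch using the t1 folder from the note above; only the bucket name is a placeholder:

```
# Create the target folder first, then copy the bucket contents into it
mkdir t1
gsutil -m cp -r gs://my-example-bucket/* ./t1
```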