I’ve always been on the fence when it comes to using a GUI over the CLI to transfer and update files on a remote server. For the majority of my work I use Fetch when I need to transfer or grab single files, and sometimes I’ll even go as far as mirroring with Fetch to sync entire projects. There are better ways for sure, but I really wanted to see what else is out there and the benefits of each method and tool available (ones that don’t use a GUI at all!). I decided to take to the tweeters in hopes that other developers would provide me with some feedback and opinions.
Why do some of you that prefer using Terminal over FTP like it so much? I’m researching the benefits at the moment and need feedback.
— Dennis Gaebel (@gryghostvisuals) April 23, 2013
There are a few options when it comes to transferring files without a GUI. Here are some of the top transfer commands I currently know of using the CLI…
```shell
# Grabs files from a remote URI
# e.g. wget http://grayghostvisuals.com/dir/somefile.png
wget [url to your remote resource]

# Copies a file from the local machine to a remote server
scp ~/Desktop/letter.png firstname.lastname@example.org:~/var/grayghostvisuals.com/

# Copies a file from a remote server to the local machine
scp email@example.com:~/var/grayghostvisuals.com/www/some-dir/some-image.png ~/Desktop/

# Syncs files or directories from your machine to a remote server
# Ref: http://code.seanodonnell.com/?id=38
# -r, -t and -v are shorthand for --recursive, --times (preserves timestamps) and --verbose
rsync -rtv ~/Sites/myWebApp firstname.lastname@example.org:~/var/grayghostvisuals.com/www/sync-directory/
```
These three commands are the basis of any file transfer from a terminal shell. A few things worth knowing:

- `rsync` is not the best at tracking files the way Git can, since Git detects when files have changed or are no longer present in the original source.
- `wget` works great, but only for single-retrieval scenarios and one-way communication. Not to mention you need to know the URI any time you want to execute it.
- `scp` allows files to be copied to, from, or between different hosts, which makes it great for two-way communication. It uses SSH for data transfer and provides the same authentication and level of security as SSH.

I don’t know about you, but `scp` and the rest of it sounds like a ton of typing compared to using a GUI, and we still can’t sync an entire project or an array of subdirectories properly. This is where SSH and post hooks work great when using Git for deployment.
Dandelion for Deployment
I want to share with you a new workflow for deploying and pushing files and projects instead of using a GUI for FTP/SFTP transfers and syncing. This tool is gonna blow your mind. It’s called Dandelion and it will amaze you with its abilities and customization options. Just to be up front: Dandelion has dependencies and yes, there are still quirks, so deal with it.
Dandelion is a deployment tool that transfers over SFTP, but from the CLI, and it runs on Ruby. The reason you’ll want to give it a go is that you’ve run into scenarios where you wanna deploy your project, and maybe you also want to deploy only certain files to a remote machine. Dandelion lets you control which files are mirrored, and takes it one step further by letting you configure it for yourself. Simply running `dandelion deploy` from the CLI pushes your project directories and files to the remote machine you specify. I’ve come across a few glitches though, so I wanted to share those and also cover how to get going with this wonderful tool. Let’s get it installed, shall we?
In order to run Dandelion we gotta get it first, right? Let’s visit GitHub and take a look at the repo: https://github.com/scttnlsn/dandelion.
As of this writing Dandelion only runs on `ruby 1.9.3`, so if you’re using `rvm` to manage and switch Ruby versions, simply run `rvm list rubies`. For me it looks like this…
```shell
grayghostvisuals at GrayGhostVisualsMacBookAir.local ~
$ rvm list rubies

rvm rubies

   ruby-1.9.3-p392 [ x86_64 ]
=* ruby-2.0.0-p0 [ x86_64 ]
   ruby-2.0.0-p195 [ x86_64 ]

# => - current
# =* - current && default
#  * - default
```
As you can see my default is `2.0.0-p0`, so I’ll need to choose the `1.9.3` version. To switch, I run `rvm use ruby-1.9.3-p392`, which moves me over to the Ruby version Dandelion currently requires. Once we have the correct version of Ruby we need to install the gems. Again, I use rvm, so I execute `rvm gemset create dandelion`, which gives me a gemset quarantined from the other gemsets installed on my machine. Now I run `gem install dandelion` as the README explains and BOOM!
Side note: although the repo doesn’t explicitly say so, you’ll need one more little piece to get going: `net-sftp (2.1.1)`. It can be installed by running `gem install net-sftp`, but the CLI will give you a warning, so just do what it says.
Dandelion Going Further
Now that the sweat has cleared from our eyes after all this installation nonsense, we can get down to the heart of the tool: deploying. Dandelion provides some cool tricks to make life simpler. For my current situation I’m using Fetch as my SFTP/FTP GUI. Sometimes I use SSH to run a `git pull` from my remote machine, but I don’t do much mirroring with something like `rsync`. With Fetch I also mirror and then check the option to delete stray items that no longer exist. I’ve tried a post hook with GitHub too, which works pretty well, but it still doesn’t give me what I want most: control over what is sent upstream to the live server.
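For reference, the Git post-hook deployment mentioned above usually means a bare repository on the server whose `post-receive` hook checks each push out into the live directory. Here’s a minimal local sketch of that pattern — every path and name here is made up for the demo; on a real server `live` would be something like your web root:

```shell
# "Server" side: a bare repo plus a directory for the live site.
mkdir -p live
git init --bare site.git

# The post-receive hook checks the pushed master branch out into live/.
cat > site.git/hooks/post-receive <<'EOF'
#!/bin/sh
GIT_WORK_TREE="$PWD/../live" git checkout -f master
EOF
chmod +x site.git/hooks/post-receive

# "Local" side: commit and push to deploy.
git init work
cd work
git symbolic-ref HEAD refs/heads/master   # pin the branch name for the demo
echo "hello" > index.html
git add index.html
git -c user.email=dev@example.com -c user.name=dev commit -m "deploy"
git push ../site.git master
cd ..

cat live/index.html
```

After the push, `live/index.html` exists on the “server” — deployment is just `git push`. Notice the trade-off, though: the hook deploys the entire tree, which is exactly the lack of per-file control Dandelion addresses.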
I can use Dandelion’s config file, aptly titled `dandelion.yml`, to tell Dandelion what to upload (and what not to) to my destination of choice. Pretty cool. So let’s say I have a project that looks something like the following…
```
root
|_____index.html
|_____.sass-cache
|_____README.md
|_____config.rb
|_____js/
|_____css/
|_____assets/
|_____protected/
```
With Dandelion I can say, “Hey, when you upload those files to the server, don’t upload the `protected` directory, but keep it in the repo for the team to share.” That way the directory stays in a private GitHub repo, for instance, but stays out of a client’s server. I think that deserves a high-five. Of course it’s easy to keep dotfiles out with FTP mirroring, but what about that nasty little Compass config file? The README? That’s not a dotfile, is it? Doesn’t look like one to me, so let’s keep those items out of the client’s server too, just for good measure, using Dandelion’s config file.
If you’re using Media Temple, I suggest following my setup below closely. I ran into a quirk where Dandelion was deploying to the `.home` directory instead of my actual `home` directory until I found the solution in this issue.
```yaml
# Required
# --------
scheme: sftp
host: XXXXXX.gridserver.com
username: XXXXXX
password: XXXXXX

# Optional
# --------

# Remote path
# When using w/ Media Temple make sure to have this path structure
# to avoid stuff going into a symlinked .home directory.
# Also the forward slash before "home" is very important to have.
path: /home/XXXXXX/var/grayghostvisuals.com/html/
port: 22

# These files (from Git) will not be uploaded during a deploy
exclude:
    - .gitignore
    - dandelion.yml
    - .sass-cache
    - README.md
    - codekit-config.json
    - config.rb
    - .gitattributes

# These files (from your working directory) will be uploaded on every deploy
# additional:
#     - dir/
```
As you’ve noticed in the config, I can make sure certain directories are not sent and, more importantly, keep them in the repo for the team to share with one another.
Do you have another system you prefer or ways to make all this nonsense I’m doing even cooler? Let me know in the comments.
- Web Design Weekly, Deploy Sites via GitHub: http://web-design-weekly.com/screencasts/deploy-sites-via-github/#more-2606
- The Cognitive Style of Unix: http://blog.vivekhaldar.com/post/3339907908/the-cognitive-style-of-unix
- Example syntax for Secure Copy (scp): http://www.hypexr.org/linux_scp_help.php
- wget GNU documentation: http://www.gnu.org/software/wget/manual/wget.html