
Setting Jenkins on AWS as a CI server for Salesforce

Why Continuous Integration?

Before we get to the guide on setting up Jenkins on AWS, we need to answer a question: why do we even need a continuous integration server for Salesforce development? Why not keep using change sets to deploy? The simple answer is automation: the bigger your Salesforce development team, the harder it becomes to deploy change sets between sandboxes, and harder still when two developers are working on the same components. A continuous integration server simply picks up the changes from each developer, deploys them to the shared sandboxes and runs all tests. When a developer wants the changes made by other developers, they just pull the latest from the code repository.

The best development practice is to make sure every developer uses their own developer sandbox. We also use two shared environments: one called Staging, which we use to run all tests and do QA, and a second, full sandbox we call PreProd, where the customer performs UAT. Since Salesforce now allows 25 developer sandboxes on Enterprise Edition, there’s no reason for more than one developer to work in the same sandbox.

The process works through a shared code repository. Once a developer finishes a feature, they commit the changes to the shared repository. Jenkins picks this up, deploys the code to the Staging environment and runs all tests. If a test fails, Jenkins marks the build as failed and the developer has to fix the issue. (Ideally, developers should run all tests in their own sandbox and not push code with failing tests, but this doesn’t always happen.) If the build succeeds, the code is also deployed to PreProd. You can choose when and how to deploy to production. Developers can always pull the latest changes from the repository and deploy them to their own sandbox; this is how they should pick up other developers’ changes.
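The day-to-day loop can be sketched with plain Git commands. Everything here is an illustrative stand-in: a local bare repository plays the part of the shared GitHub repository, and the class name is made up.

```shell
# A local bare repo stands in for the shared code repository.
mkdir -p /tmp/sfdemo && cd /tmp/sfdemo
rm -rf shared.git dev
git init --bare shared.git
git clone shared.git dev && cd dev
git config user.email "dev@example.com"
git config user.name "Dev One"

# The developer finishes a feature and pushes it; Jenkins would pick this
# up, deploy to the shared Staging sandbox and run all tests.
mkdir -p src/classes
echo 'public class HelloWorld {}' > src/classes/HelloWorld.cls
git add . && git commit -m "Add HelloWorld class"
git push origin HEAD

# Other developers pull the latest changes into their own sandboxes.
git log --oneline
```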

Setting Up an AWS Server

We chose to host our Jenkins on AWS because, at the size we need, AWS can be free, and because of the flexibility and scaling options AWS offers.

1. The first step is to create your AWS account and log in – https://aws.amazon.com

2. Select the region where you want AWS to host your server. Just choose the location closest to you; it doesn’t really matter for a CI server.

3. From the main AWS page select EC2

4. Then go to Instances section and click “Launch Instance”

5. Select Ubuntu image

6. Select an instance type. We recommend the micro instance since it’s free and you probably won’t need anything bigger; if you do, it’s possible to change it later on.

7. Then Add storage

8. Choose whether you want additional storage for your server. We went for 20GB, but that’s mainly useful if you have a big repository; if you don’t, it’s not needed.

9. Give your instance a name. “Jenkins” would make sense but feel free to be creative.

10. Create a new security group. Add Inbound firewall rules.

 

11. Allow only ports 22, 80 and 443. You can limit access to your server to certain IPs; we allowed Jenkins to be accessed from anywhere, since you need a username and password to access Jenkins anyway.

12. It’s time to launch your server:

13. Give the key pair a name and click on Download Key Pair. This is important: you’ll need the key file to access your server.

14. Then Save Key Pair

15. And Finally, Launch Instance.

16. Your instance is now set up and has a public DNS address. Keep it; you’ll need it later.

That’s it! Our AWS server is up and running.

Installing Jenkins on the AWS Server

In order to install Jenkins on the AWS server, we’ll need to access the server using the .pem file we downloaded earlier.

If you’re using Linux you can access the server by running this command

$ ssh -i myKeyFile.pem ubuntu@theDnsAddressOfYourAwsServer
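Note that ssh refuses private keys that other users can read, so tighten the permissions on the .pem file first. A minimal sketch (the key file here is an empty stand-in for the one you downloaded from AWS, and the address is a placeholder):

```shell
# Empty stand-in for the key file downloaded from AWS.
touch /tmp/myKeyFile.pem
chmod 400 /tmp/myKeyFile.pem   # ssh rejects keys with loose permissions

# The real connection (not run here) would then be:
echo "ssh -i /tmp/myKeyFile.pem ubuntu@theDnsAddressOfYourAwsServer"
```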

And then jump to step 10.

If you’re using Windows, you’ll need PuTTY to connect to the AWS server.

1. On Windows, we first need to convert the .pem file to a .ppk file using PuTTYgen.

You can download PuTTYgen from https://the.earth.li/~sgtatham/putty/latest/x86/puttygen.exe

2. Open PuTTYgen and load the .pem key file.

3. Browse to your .pem file and hit Open.

4. Then click “Save private key”. Save without a passphrase.

Now that we have a .ppk file, we need PuTTY to access the server.

You can download PuTTY from https://the.earth.li/~sgtatham/putty/latest/x86/putty.exe

5. Open PuTTY and in “Host Name” put ubuntu@yourAwsDnsAddress. This is the same public DNS address from step 16 above (you can use the instance IP instead, but the DNS address is easier).

6. Go to the Auth section and browse for the .ppk file.

7. Now go to the Session section, give the session a name and save it. This will let you connect to the server more easily next time.

8. Now click on “Open” to access your AWS server.

9. A terminal window should open with an Ubuntu prompt.

10. Run the following commands to install Tomcat and the other tools we’ll need:

sudo apt-get update
sudo apt-get install vim
sudo apt-get install default-jdk
sudo apt-get install tomcat7
sudo chown -R tomcat7:tomcat7 /usr/share/tomcat7
sudo rm -rf /var/lib/tomcat7/webapps/*
sudo apt-get install git
sudo apt-get install ant
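After the installs finish, it’s worth a quick sanity check that the tools Jenkins will rely on are on the PATH. A small sketch (the output depends on what is actually installed on your machine):

```shell
# Report whether each required build tool is installed.
for cmd in git ant java; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: installed"
  else
    echo "$cmd: missing"
  fi
done | tee /tmp/toolcheck.txt
```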
11. Download the Jenkins WAR file (version 2.6 here) to install Jenkins:
wget https://updates.jenkins-ci.org/download/war/2.6/jenkins.war
12. Move jenkins.war into Tomcat’s webapps folder as the root application, then restart Tomcat:
sudo mv jenkins.war /var/lib/tomcat7/webapps/ROOT.war
sudo service tomcat7 restart
13. Tomcat is now running Jenkins on port 8080. Next we’ll install nginx to redirect port 80 to it:
sudo apt-get install nginx
14. We’ll also need to edit the nginx configuration file; use this command to open it for editing:
sudo vim /etc/nginx/sites-enabled/default
Press “i” to start editing. You can delete all the content and paste the following text. When you’re done, press Esc, type “:wq” and hit Enter to save.
upstream jenkins {
  server 127.0.0.1:8080 fail_timeout=0;
}

server {
  listen 80 default;
  server_name localhost;
  location / {
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    add_header Pragma "no-cache";
    proxy_pass http://jenkins;
  }
}
15. Check the configuration and restart nginx so the changes take effect:
sudo nginx -t
sudo service nginx restart
Now to access your Jenkins use http://yourAwsDnsAddress

Note 1:

You can use your own URL, like http://jenkins.myCompany.com, so the server is easy to reach. You do that by adding a DNS record pointing at your server.

Here are details how to do that: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/dynamic-dns.html

Note 2:

Our Jenkins is running without SSL, which means communication with it is not encrypted. You can create your own SSL certificate (or use one you already have) to make your Jenkins more secure, and configure your nginx server to use it.

Here are further details on how to do that:

https://www.digitalocean.com/community/tutorials/how-to-create-an-ssl-certificate-on-nginx-for-ubuntu-14-04 

When you access that URL you should see the Jenkins unlock page asking for the initial admin password.

16. To get the Jenkins admin password, use this command:

sudo cat /usr/share/tomcat7/.jenkins/secrets/initialAdminPassword

17. Enter the password that is displayed and you’ll move on to the plugin setup.

Click on “Install suggested plugins”; some of them are very handy, and one of them is the Ant plugin, which we’ll need.

You’ll see the plugins install, and then you’ll be able to create your first Jenkins user.

18. And that’s it! Jenkins is ready to go.

Creating a build pipeline for Salesforce

Now we have Jenkins up and running. All that’s left to do is to add builds to deploy to Salesforce.

I’m assuming you have a code repository somewhere, like GitHub. If you don’t, you definitely need to create one to store all of your Apex classes and Visualforce pages.

1. We’ll add a new folder to our repo called “ant” with a few files required for the deploy. They don’t have to live in the repo, but we prefer it so that every developer who clones the repo has the Ant deploy files. (Although we usually use MavensMate with Sublime Text, so they’re not strictly needed.)

To the ant folder we’ll add the ant-salesforce.jar file, which you can download from Salesforce here:

https://gs0.salesforce.com/dwnld/SfdcAnt/salesforce_ant_36.0.zip

We’ll also add the standard package.xml file to the ant folder (in addition to the one we have in the “src” folder).

Use this:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexComponent</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexPage</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexTrigger</name>
    </types>
    <types>
        <members>*</members>
        <name>StaticResource</name>
    </types>
    <version>36.0</version>
</Package>
And the last file is the build.xml, use this file:
<project name="Salesforce Development Ant tasks" default="deploy_run_tests" basedir="." xmlns:sf="antlib:com.salesforce">

  <taskdef uri="antlib:com.salesforce"
       resource="com/salesforce/antlib.xml"
       classpath="./ant-salesforce.jar"/>

  <property file="build.properties"/>
  <property environment="env"/>

  <target name="deploy_run_tests">
    <mkdir dir="../runtest" />
    <copy file="package.xml" todir="../src/" overwrite="true" failonerror="true"/>
    <sf:deploy username="${env.SFUSER}" password="${env.SFPASS}${env.SFTOKEN}" serverurl="${env.SF_SANDBOXURL}" deployRoot="../src" runAllTests="true" maxPoll="10000"/>
  </target>

  <target name="deploy_dont_run_tests">
    <copy file="package.xml" todir="../src/" overwrite="true" failonerror="true"/>
    <sf:deploy username="${env.SFUSER}" password="${env.SFPASS}${env.SFTOKEN}" serverurl="${env.SF_SANDBOXURL}" deployRoot="../src" runAllTests="false" maxPoll="10000"/>
  </target>

</project>
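For reference, build.xml resolves ../src relative to the ant folder, so it assumes a layout like the one below. This sketch uses empty placeholder files purely to illustrate the structure:

```shell
# Empty placeholder files only -- illustrates the repository layout
# that build.xml expects (it copies package.xml into ../src).
mkdir -p /tmp/repo-demo/ant /tmp/repo-demo/src/classes
cd /tmp/repo-demo
touch ant/build.xml ant/package.xml ant/ant-salesforce.jar
touch src/package.xml
touch src/classes/HelloWorld.cls
find . -type f | sort
```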
2. Before we create a build item in Jenkins we want to connect our GitHub to Jenkins. Go to “Manage Jenkins” -> “Configure System”. Under the “GitHub” section click “Advanced”, then in “Additional actions” select “Convert login and password to token”. Select the “From login and password” radio button and enter your GitHub username and password. Then click “Create token credentials”. You should get a success message.

Now click on “Add GitHub Server” and in the “Credentials” drop-down select the token credentials you just created.

3. We also need to configure Git in Jenkins. Go to “Manage Jenkins” -> “Global Tool Configuration” and under the Git section enter “default” as the name and “git” as the path to Git.

4. In Jenkins click on  “New Item”, enter “Staging Deploy” as the build name and select “Freestyle Project”

5. In the new build page:

Tick the “GitHub project” checkbox and put your repository URL, mine for this test is https://github.com/orweissler/SalesforceTest.git

Also tick the “This project is parameterized” checkbox. Add a string parameter named “SFUSER” with your Salesforce username for the staging sandbox.

Add a password parameter called “SFPASS” with your Salesforce password.

Then a password parameter called “SFTOKEN” with your Salesforce security token.

And the last one, a string parameter called “SF_SANDBOXURL” with the value “test.salesforce.com”. All of these parameters are read by the build.xml file.
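Jenkins exposes each build parameter as an environment variable, which is how build.xml picks them up. Salesforce expects the API login password to be the account password with the security token appended, which is why build.xml passes password="${env.SFPASS}${env.SFTOKEN}" to the deploy task. A quick sketch with placeholder values:

```shell
# Placeholder values -- Jenkins injects the real ones as environment variables.
export SFUSER="me@example.com.staging"
export SFPASS="myPassword"
export SFTOKEN="mySecurityToken"
export SF_SANDBOXURL="test.salesforce.com"

# build.xml concatenates password and token for sf:deploy:
echo "${SFPASS}${SFTOKEN}" > /tmp/sf_cred_demo.txt
cat /tmp/sf_cred_demo.txt
```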

 

6. Under the “Source Code Management” section select “Git”. In “Repository URL” enter your GitHub SSH URL; mine, for example, is “git@github.com:orweissler/SalesforceTest.git”.

In the credentials, click Add, select “SSH Username with private key” and paste your private GitHub SSH key. If you don’t have an SSH key on GitHub, here’s how to create one and upload it: https://help.github.com/articles/generating-an-ssh-key

Jenkins will tell you if it can access your git repository or not.

Under “Build Triggers” select “Build when a change is pushed to GitHub”.

7. We’re still not done! Under “Build” select “Add build step” and choose “Invoke Ant”. In “Targets” put “deploy_run_tests”.

Click “Advanced” and in “Build File” put “ant/build.xml”

Don’t forget to save!

8. Now we’ll create the same build for our PreProd environment. Repeat steps 5-7, but don’t select “Build when a change is pushed to GitHub”, and enter your PreProd environment username, password and token. The last difference: instead of the deploy_run_tests target we can use the deploy_dont_run_tests Ant target, since this is the follow-up build and we already know the tests pass.

9. Go back to Staging configuration and under “Post-build Actions” add “Build other projects”. Select your PreProd build and “Trigger even if the build is unstable” as we want to build PreProd as long as Staging tests pass and the build is successful.

10. And that’s it! Now every push to your repository by any developer will automatically deploy the code to your shared Staging environment, make sure all tests pass and, if everything goes well, deploy it to your PreProd environment so users can perform UAT on your features. You can customise the builds to fit your needs; you could add a production deploy here as well and schedule it or start it manually.

Credits for helping with this article:

Bhrigun Prasad