Cloud object storage is a powerful tool for storing and delivering media, software, and other digital assets. Evolving enterprises require scalable, long-term storage for a wide variety of digital objects. Amazon S3 (Simple Storage Service) revolutionized cloud object storage when it launched in 2006. The S3 API has since become a de facto standard for object storage design, and many cloud providers now offer compatible services and solutions. CenturyLink Cloud Object Storage is one such service, providing enterprise-grade object storage in a highly-scalable, fault-tolerant, distributed datastore.

Although Object Storage is fully compatible with S3, not every S3-compatible library will connect easily to our API. In this tutorial, we will look at the best libraries for working with Object Storage in four popular programming languages: Java, Node.js, Go, and PHP.

Tools Used

CenturyLink Cloud Object Storage offers enterprise-grade object storage. Our cloud servers store and manage your files in a highly-scalable, fault-tolerant, distributed datastore. For large-scale cloud applications, Object Storage is far more efficient than hierarchical file systems.

Adding Object Storage to Your Account

Follow the steps below to use the CenturyLink Cloud Console to add an Object Storage bucket and user to your account.

Before We Start

If you don’t have a CenturyLink Cloud account yet, head over to our website and sign up for a free trial. You’ll need an account to access the CenturyLink Cloud Console and create Object Storage users and buckets.

Creating Object Storage Users

To create a new Object Storage user, follow the steps below:

  1. After logging into the CenturyLink Cloud Console, navigate to Object Storage from the top drop-down menu.

    CenturyLink Cloud Control Panel

  2. On the Object Storage page, click the Users tab.

    Object Storage Users tab

  3. Click the create user button and enter the user information.

    Note: The email address for the user must be unique across the Object Storage platform and cannot be reused.

    Object Storage create user

  4. Click save.

  5. Click the newly-created user record to view the access key id and secret access key values, which act as the username and password for this Object Storage user. Save these values for later use.

    Object Storage user details

Creating Object Storage Buckets

To create a new Object Storage bucket for storing digital assets, follow the steps below.

  1. After logging into the CenturyLink Cloud Console, navigate to Object Storage from the top drop-down menu.

    CenturyLink Cloud control portal

  2. On the Object Storage page, navigate to the Buckets tab.

  3. Click the +create bucket button.

    Object Storage create bucket

  4. Fill out the "Create Bucket" form.

    The bucket name should start and end with lowercase letters or numbers, and can only contain lowercase letters, numbers, dashes, and dots.

    Note: This value must be unique globally across the Object Storage system.

    Object Storage bucket details

  5. Click the save button to create the bucket.
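The naming rules in step 4 are easy to check programmatically before calling the API. Here is a small sketch in Go (the function name and sample values are our own, not part of the Object Storage API) that validates a candidate bucket name against those rules. Note that real S3-compatible services typically also enforce length limits not shown here.

```go
package main

import (
	"fmt"
	"regexp"
)

// bucketNameRe encodes the rules above: the name must start and end with a
// lowercase letter or digit, and may contain only lowercase letters, digits,
// dashes, and dots in between.
var bucketNameRe = regexp.MustCompile(`^[a-z0-9]([a-z0-9.-]*[a-z0-9])?$`)

// isValidBucketName reports whether name satisfies the naming rules.
func isValidBucketName(name string) bool {
	return bucketNameRe.MatchString(name)
}

func main() {
	for _, name := range []string{"my-assets.2016", "My_Bucket", "-media"} {
		fmt.Printf("%s: %v\n", name, isValidBucketName(name))
	}
}
```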

Additional Object Storage Operations

For more information on managing Object Storage buckets and users from the Console, check out this article in our knowledge base.

Endpoints, Regions, and Buckets

If you haven't used Object Storage or other S3-compatible systems before, some of the terminology might be unfamiliar.

An object is an individual digital asset. This can be any sort of data from text to video to images.

A bucket is a resource for holding objects. Buckets have simple names consisting of lowercase letters, numbers, dashes, and dots, roughly the character set of a valid DNS hostname.

A region is a data center or cloud service area that hosts Object Storage buckets. One region contains any number of buckets.

An endpoint is a hostname that serves the Object Storage API for a region. Each region has one or more endpoints, and each endpoint belongs to a region. When selecting a library to access the Object Storage endpoint, it is critical to find one that is not only S3-compatible, but allows the developer to specify a custom endpoint.
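To make the relationship between endpoint, bucket, and object concrete, here is a small sketch (our own illustration, with a placeholder hostname rather than a real Object Storage endpoint) of how an S3-compatible client typically derives a path-style request URL from those three pieces:

```go
package main

import "fmt"

// objectURL builds a path-style request URL of the form
// https://<endpoint>/<bucket>/<key>, the addressing scheme most
// S3-compatible clients use against a custom endpoint.
func objectURL(endpoint, bucket, key string) string {
	return fmt.Sprintf("https://%s/%s/%s", endpoint, bucket, key)
}

func main() {
	// "objectstorage.example.com" is a placeholder, not a real endpoint.
	fmt.Println(objectURL("objectstorage.example.com", "my-assets", "images/logo.png"))
}
```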

Throughout this article, we will use the "Canada" region in the examples. If you choose US-East as the data center for your buckets instead, you will need to change the endpoint references in the examples accordingly.

CenturyLink Cloud Object Storage Regions


Java

The AWS SDK for Java can be reconfigured to access Object Storage. To add S3 support to your existing Java project, follow the directions below:

Add Maven dependencies

If you are using Maven, add the following dependencies to pom.xml.


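The original dependency listing does not appear here, but a minimal entry, assuming the v1 AWS SDK for Java (the version number shown is illustrative), would look something like this:

```xml
<dependencies>
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.3</version>
  </dependency>
</dependencies>
```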

Listing Objects in a Bucket

The following example will list all objects in a bucket. Replace the private members of the TestS3Compat class with the values you collected in "Adding Object Storage to Your Account."

import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class TestS3Compat {
    private static String bucketName = "YOUR-BUCKET-NAME";
    private static String ctlAccessKey = "YOUR-ACCESS-KEY";
    private static String ctlSecretKey = "YOUR-SECRET-KEY";
    private static String ctlEndpoint = "";

    public static void main(String... args) throws Exception {
        AWSCredentials awsCredentials = new BasicAWSCredentials(ctlAccessKey, ctlSecretKey);

        // CTL Object Storage uses a different request signature than AWS S3.
        ClientConfiguration clientConfig = new ClientConfiguration();
        clientConfig.setSignerOverride("S3SignerType");

        // Configure our S3 client to use the correct CTL endpoint.
        AmazonS3Client s3client = new AmazonS3Client(awsCredentials, clientConfig);
        s3client.setEndpoint(ctlEndpoint);

        try {
            System.out.println("Listing objects in " + bucketName);
            ObjectListing result = s3client.listObjects(bucketName);
            while (true) {
                for (S3ObjectSummary objectSummary : result.getObjectSummaries()) {
                    System.out.println(" - " + objectSummary.getKey() + "  " +
                            "(size = " + objectSummary.getSize() + ")");
                }
                // Keep fetching pages until the listing is no longer truncated.
                if (!result.isTruncated()) {
                    break;
                }
                result = s3client.listNextBatchOfObjects(result);
            }
        } catch (AmazonServiceException ase) {
            System.out.println("Caught a service exception, which means your request made it " +
                    "to CenturyLink Cloud Object Storage, but was rejected with an error response for some reason.");
            System.out.println("Error Message:    " + ase.getMessage());
            System.out.println("HTTP Status Code: " + ase.getStatusCode());
            System.out.println("AWS Error Code:   " + ase.getErrorCode());
            System.out.println("Error Type:       " + ase.getErrorType());
            System.out.println("Request ID:       " + ase.getRequestId());
        } catch (AmazonClientException ace) {
            System.out.println("Caught a client exception, which means the client encountered " +
                    "an internal error while trying to communicate with Object Storage, such as not being able to access the network.");
            System.out.println("Error Message: " + ace.getMessage());
        }
    }
}

Node.js

There are several S3-compatible libraries for Node.js that allow you to specify custom endpoints. Two that meet our requirements are Knox and the simply-named s3 library from Andrew Kelley. Unfortunately, because of some fundamental problems with the underlying Amazon SDK, s3 is no longer supported.

The Knox library is a solid choice, produced and actively developed by Automattic, a company well-known for releasing stable software. Its main drawback is that it doesn't natively support multi-part uploads for large files. However, the library documentation lists supplementary libraries that implement many of the other popular S3 features you might need.

We used the Knox library to build our Node.js tutorial application. You can check that out for a more complete example, but getting started with Knox takes just a couple of steps.

  1. In your Node.js project directory, run npm install knox --save to add Knox to your project's list of dependencies.

  2. Create a Knox client in your application using code similar to the following:

        /* Object Storage configuration */
        var knox = require('knox'),
            client = knox.createClient({
              key      : "YOUR-ACCESS-KEY",
              secret   : "YOUR-SECRET-KEY",
              bucket   : "your-bucket-name",
              endpoint : ""
            });


Go

In case you haven't heard of it yet, Go is a compiled, statically-typed language released by Google in 2009. It has a range of powerful features, including garbage collection, built-in concurrency, and limited type inference. One library that does a good job of supporting the Amazon S3 interface is Mitchell Hashimoto's port of goamz, which covers a good selection of the S3 API. The following is a complete example of a short program that lists the buckets in Object Storage.

  1. In a new project directory, create a file called bucketlist.go and edit it to look like this:

        package main

        import (
                "fmt"

                "github.com/mitchellh/goamz/aws"
                "github.com/mitchellh/goamz/s3"
        )

        func main() {
                ctlCanada := aws.Region{Name: "Canada", S3Endpoint: ""}
                //ctlUSEast := aws.Region{Name: "USEast", S3Endpoint: ""}
                auth := aws.Auth{AccessKey: "YOUR-ACCESS-KEY", SecretKey: "YOUR-SECRET-KEY"}
                svc := s3.New(auth, ctlCanada)
                buckets, err := svc.ListBuckets()
                if err != nil {
                        fmt.Println("Error listing buckets:", err)
                        return
                }
                for _, b := range buckets.Buckets {
                        fmt.Printf("- %s\n", b.Name)
                }
        }
  2. Run the following commands:

    $ go get -u github.com/mitchellh/goamz/s3
    $ go build -o bucketlist
  3. You can now run ./bucketlist and get a list of Object Storage buckets for your account.


PHP

There are several S3 solutions available for PHP, but the simplest and easiest to set up is Donovan Schönknecht's S3 REST client class. For this example, you will need the PHP cURL extension installed on your system, as well as the Composer dependency manager.

  1. In your project directory, make a composer.json file that looks like this:

        {
          "require": {
            "tpyo/amazon-s3-php-class": "*"
          }
        }
  2. Run composer install or, on some systems, php composer.phar install.

  3. Create a file called listbuckets.php and edit it to look like this:

        <?php
        require __DIR__ . '/vendor/autoload.php';

        $ctlAccessKey = "YOUR-ACCESS-KEY";
        $ctlSecretKey = "YOUR-SECRET-KEY";
        $ctlCanada = "";

        // S3 constructor arguments: access key, secret key, use SSL, endpoint.
        $s3 = new S3($ctlAccessKey, $ctlSecretKey, true, $ctlCanada);

        foreach ($s3->listBuckets() as $bucket) {
            echo "- " . $bucket . "\n";
        }
  4. Run the program from the command line like this: php listbuckets.php


In this article, you learned how to use S3-compatible libraries for four different languages to bring the power of CenturyLink Cloud Object Storage to your application. Storing digital assets in the cloud adds new levels of versatility and speed to an application, and those assets benefit from the enhanced redundancy and fast delivery that CenturyLink Cloud brings to all of its cloud product offerings.

Sign up for CODE, our developer-focused newsletter. It's designed by developers, for developers, and keeps you up-to-date on topics of interest, including tutorials, tips and tricks, and community events.

We’re a different kind of cloud provider – let us show you why.