Using your own cloud hosting

You can use your own cloud storage provider to store all your Micrio images and data. This article explains how to connect your own AWS, Google Cloud or Azure account.

The Micrio service will need to have full access to a storage bucket inside your cloud hosting provider. When an image is uploaded in your organisation, it will be stored there, along with all processed IIIF data, uploaded assets, and published JSON data.

This is particularly useful if you have a large number of images and don't want to rely on Micrio's own (Google-based) cloud hosting, or if you work with sensitive or private data and need to retain full data ownership and restrict access.

NOTE: This feature is available to Enterprise subscription accounts only.

Requirements

Since Micrio needs full read/write access to a storage account, it is good practice to create a dedicated Micrio storage bucket. Access is granted to Micrio using API keys, which limits Micrio's access to the rest of your infrastructure.

The sections below contain per-provider instructions for linking your Micrio account to your own storage account.

Content Distribution Network

When you run a public image service, it's generally a good idea to have a CDN (Content Distribution Network) in front of your storage account. This ensures content is cached close to your end users, keeping download times short worldwide.

Depending on your cloud provider, see the optional CDN step at the end of each section below for instructions on using their CDN capabilities.

However, if your data needs to stay centralized and traffic is limited, a CDN is not recommended.

1. Google Cloud

To get your own Google Cloud Storage account working with Micrio, follow these steps:

  1. Create your Micrio storage bucket in the Google Cloud Console. Make sure this bucket will only be used by Micrio, and doesn't contain any other unrelated data.
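
    If you prefer the command line, the bucket can also be created with gsutil. A minimal sketch (the bucket name and location are placeholders):

    # create the bucket; step 2 below uses ACLs, so keep fine-grained (non-uniform) access control
    gsutil mb -l EU gs://your-bucket-name
    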

  2. Set the bucket to be publicly readable, meaning users can access the data from their browser. Using the gsutil command line tool, you can do this:

    gsutil defacl ch -u AllUsers:R gs://[your-bucket-name]
    
  3. Next, allow cross-origin (XHR) requests to this bucket. This allows users to upload data directly from the Micrio dashboard, and lets the Micrio client viewer optimally download the image tiles.

    First, create a file called cors.json:

    [
      {
        "maxAgeSeconds": 3600,
        "method": ["GET", "HEAD", "PUT", "POST", "OPTIONS"],
        "origin": ["*"],
        "responseHeader": ["Content-Type", "x-ms-blob-type"]
      }
    ]
    

    Then set the CORS configuration using the following gsutil command:

    gsutil cors set cors.json gs://[your-bucket-name]
    
  4. Next, in the Cloud Console dashboard, create a Service Account that will be given full read and write access to the bucket. This Service Account will be used by Micrio only, and has to be kept private at all times.

    Go to IAM & Admin > Service Accounts, and select Create Service Account.

    Give it a name and description, and leave step 2 ("Grant this service account access to project") empty for now. In step 3, you can optionally grant other Google Cloud users access to the Service Account.

  5. Now we need the JSON key for this Service Account, which will be given to Micrio.

    In your newly created Service Account, go to the Keys tab, and click Create new key. When asked about key type, select JSON:

    Creating your Google bucket private key

    Click Create to download a JSON key file. This can only be done once, so save the file in a secure location (or password manager).

  6. Next, give the newly created Service Account read/write control of the storage bucket. In the Cloud Console, open Cloud Storage > Browser, and open the context menu on the far right of the bucket you created for Micrio. Select Edit bucket permissions, then click Add member.

    In the new panel, search for your newly created Service Account and select it. Then, under Select a role, search for Storage Object Admin.

    Don't enter any conditions. Press Save.
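
    If you prefer the command line, steps 4 to 6 can also be done with gcloud and gsutil. A minimal sketch, assuming a placeholder project your-project-id, service account name micrio-storage and bucket your-bucket-name:

    # create the dedicated service account
    gcloud iam service-accounts create micrio-storage --display-name="Micrio storage access"
    
    # download its JSON key (this is the file you will share with Micrio)
    gcloud iam service-accounts keys create micrio-key.json \
        --iam-account=micrio-storage@your-project-id.iam.gserviceaccount.com
    
    # grant the service account object admin rights on the Micrio bucket
    gsutil iam ch \
        serviceAccount:micrio-storage@your-project-id.iam.gserviceaccount.com:roles/storage.objectAdmin \
        gs://your-bucket-name
    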

  7. Everything is now correctly in place, congrats! The next step is to contact us, so we can link your storage key to your Micrio account. From then on, all Micrio uploads, assets and data will be stored entirely in your own storage bucket, independently of Micrio's own hosting.

  8. Optional: create a Google Cloud CDN in front of your bucket to ensure fast global delivery of your imagery. Open Network Services > Cloud CDN, and set it up from there.

2. Microsoft Azure

To get your own Azure Storage account working with Micrio, follow these steps:

  1. In the Azure Portal, create a new Storage Account that will be used exclusively by Micrio.
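
    For reference, a rough Azure CLI equivalent (the account name, resource group and location are placeholders):

    az storage account create --name yourstorageaccount --resource-group your-resource-group --location westeurope --sku Standard_LRS
    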

  2. When the storage account has been created, go to the CORS settings for the Blob service, and add a rule that allows cross-origin requests from the browser, so the Micrio dashboard can upload directly to the storage account and the Micrio viewer can download the image tiles.

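    As an example, the following Azure CLI command sets a permissive CORS rule for the Blob service, mirroring the settings used for the Google bucket above (the account name is a placeholder):

    az storage cors add --services b --methods GET HEAD PUT POST OPTIONS --origins '*' --allowed-headers '*' --exposed-headers '*' --max-age 3600 --account-name yourstorageaccount
    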

  3. Next, under Containers, create a new container. Give it a name like micrio (container names must be lowercase), and set the Public access level to Blob.
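
    With the Azure CLI, this would look something like:

    az storage container create --name micrio --account-name yourstorageaccount --public-access blob
    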

  4. You are almost there. Go to Access keys, and click Show keys. Copy the Connection string for your primary and secondary keys to a secure location, such as a key vault.
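
    The connection string can also be retrieved with the Azure CLI:

    az storage account show-connection-string --name yourstorageaccount --resource-group your-resource-group
    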

  5. The next step is to contact us, so we can link your Azure storage account to your Micrio organisation account using the connection string from the previous step.

  6. Optional: create an Azure CDN in front of your storage account to ensure fast global delivery of your imagery. Simply open the Azure CDN tab under Blob Service in your Storage dashboard, and set it up there.

3. Amazon Web Services

To get your own AWS S3 storage account working with Micrio, do the following:

  1. In your S3 AWS console, create a new bucket by clicking Create bucket.

  2. As a name, enter micrio (this exact name is used in the bucket policies later on). For the region, select the one closest to your location, such as eu-north-1.

  3. Uncheck the "Block all public access" option, and keep all options under it unchecked as well. This makes the bucket publicly readable for your website visitors.

    Check the acknowledgement that you understand this (because you do).

  4. Keep Bucket Versioning disabled, add any optional tags (not required), and keep Default encryption disabled as well.

    Click Create bucket.
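
    If you prefer the AWS CLI, steps 1 to 4 roughly correspond to the following commands (use the bucket name and region you chose above):

    # create the bucket in your chosen region
    aws s3api create-bucket --bucket micrio --region eu-north-1 \
        --create-bucket-configuration LocationConstraint=eu-north-1
    
    # disable the "Block all public access" settings
    aws s3api put-public-access-block --bucket micrio \
        --public-access-block-configuration BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
    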

  5. In the next screen, you will see your newly created bucket. Open it by clicking it, then go to the Permissions tab.

  6. In the Bucket Policy part, click Edit, and paste this JSON fragment:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::micrio/*"
            }
        ]
    }
    

    This makes the bucket publicly readable by default. Click Save changes.
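
    Alternatively, save the JSON above to a file and apply it with the AWS CLI (the filename is a placeholder):

    aws s3api put-bucket-policy --bucket micrio --policy file://bucket-policy.json
    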

  7. Next, scroll down to the Cross-origin resource sharing (CORS) part, and click Edit. Paste this JSON fragment:

    [
        {
            "AllowedHeaders": [
                "*"
            ],
            "AllowedMethods": [
                "PUT",
                "GET"
            ],
            "AllowedOrigins": [
                "*"
            ],
            "ExposeHeaders": [
                "x-amz-server-side-encryption",
                "x-amz-request-id",
                "x-amz-id-2"
            ],
            "MaxAgeSeconds": 3000
        }
    ]
    

    And click Save changes.
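
    Or save the JSON above to a file and apply it with the AWS CLI. Note that the CLI expects the rules to be wrapped in a CORSRules object (the filename is a placeholder):

    # cors.json for the CLI: { "CORSRules": [ ...the rules shown above... ] }
    aws s3api put-bucket-cors --bucket micrio --cors-configuration file://cors.json
    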

  8. Next, let's create an Access Key that will be shared with Micrio, so it can have full control of this bucket.

    Go to the IAM Dashboard, and click Add user. Name this user micrio, and select Programmatic access as the access type.

    Click Next: Permissions, and leave this blank. Click Next: Tags, where you can fill in some optional tags. Then click Create user.

  9. After the user is created, you will see it in the list of users. Copy the Access key ID and Secret access key to a secure location, such as a key vault. You will need these later; this is important because it is the only time you will see both keys without having to regenerate them.
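
    For reference, the user, its access key pair and its ARN can also be obtained with the AWS CLI (the key pair is printed once in the output of the second command):

    # create the user, generate its access key pair, and show its ARN
    aws iam create-user --user-name micrio
    aws iam create-access-key --user-name micrio
    aws iam get-user --user-name micrio
    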

  10. Open the Users page again, and click your newly created user. Copy the User ARN for this user (looks like arn:aws:iam::1234567890:user/micrio).

  11. Go back to the S3 Console. Open the Micrio bucket, and click Permissions. Scroll down to Bucket Policy, and click Edit.

    In the JSON we've added earlier, add another JSON object to the Statement array:

    {
        "Sid": "MicrioAccess",
        "Effect": "Allow",
        "Principal": {
            "AWS": "arn:aws:iam::1234567890:user/micrio"
        },
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::micrio",
            "arn:aws:s3:::micrio/*"
        ]
    }
    

    This grants the new Micrio user full read/write access to the bucket.

    Click Save changes. If something is not correct, the policy editor will give you feedback about it.

  12. You're done! The next step is to contact us, so we can connect your S3 access keys to your Micrio organisation account. From then on, all Micrio uploads, data and assets will be stored in your own bucket, independently of Micrio's own hosting.

  13. Optional: create a CloudFront distribution in front of your bucket to ensure fast global delivery of your imagery.

✉ Can't find what you're looking for? Contact us!