

Introduction

To simulate the use of S3 storage in your Business Processes, Intelligent Automation Cloud Business offers a local File Storage, which is installed and configured on your machine during product setup.

In RPA Express 2.3.0 and later, File Storage is available only with the Development Workstation and Server (Free + Premium Features) installation type.

There is a preconfigured common bucket called Public with Read/Write access, so you can upload and download files there.

File Storage

To create a new bucket or use an existing one to upload files, use the local File Storage, which is installed with Intelligent Automation Cloud Business. To access File Storage, do the following.

  1. Go to Components and enable the server components.

     For RPA Express 2.2.x and earlier, select File Storage from the RPA Express tray menu instead.

  2. Log in using your credentials.

    The credentials for File Storage are defined during installation. See Intelligent Automation Cloud Express/Business Installation for more details.

  3. Upon login, you can browse the contents of File Storage, create new buckets, or upload and delete files.

    The other buckets are used internally by Intelligent Automation Cloud Business. Do not use, modify, or delete them!

Creating Bucket

You can create your own buckets for your Business Processes, if needed.

  1. Click the plus button.
  2. Select Create bucket.
  3. Enter the name for your bucket.
  4. Set the Bucket Policy (by default, a new bucket is created as read-only).
    1. Hover over the bucket and click the Options icon.
    2. Change the default read-only policy to the type you need for your Business Process.

      Add the asterisk (*) symbol to the Prefix field.

    3. Click Add to save your changes and close the window.
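The read/write access that you grant in step 4 corresponds to a standard S3-style bucket policy document. The sketch below assembles one in Python; the bucket name my-bucket is illustrative, and the wildcard resource mirrors the asterisk (*) entered in the Prefix field. The exact JSON that MinIO stores may differ by version, so treat this as an assumption-laden sketch, not the product's internal format.

```python
import json

def read_write_policy(bucket: str, prefix: str = "*") -> str:
    """Build an S3-style bucket policy granting anonymous read/write
    access to objects under the given prefix (illustrative sketch)."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": ["*"]},
                "Action": ["s3:GetObject", "s3:PutObject"],
                # The '*' here plays the role of the Prefix field above.
                "Resource": [f"arn:aws:s3:::{bucket}/{prefix}"],
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(read_write_policy("my-bucket"))
```

A read-only bucket would carry only s3:GetObject in the Action list; adding s3:PutObject is what makes the bucket writable for uploads.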

Uploading Files to Bucket

You can upload files for your Bot Tasks to your bucket. 

  1. Select the bucket to upload a file to. If you need to upload the file to a subfolder in the bucket, select the subfolder first.
  2. Click the plus button.
  3. Select Upload files.

  4. Navigate to the file you want to upload to the bucket.

    Note

    You can upload only one file at a time!

    Now you can see the uploaded file in the bucket.
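Because the web interface accepts only one file at a time, bulk uploads are easier from a script. A hedged sketch using only the Python standard library: object_url composes the address of an object on the local endpoint (port 15110, as seen in the link examples later on this page), and upload_file issues a plain HTTP PUT, which succeeds only if the bucket policy allows anonymous writes. Both function names are illustrative, not part of the product.

```python
from pathlib import Path
from urllib.parse import quote
import http.client

ENDPOINT = ("localhost", 15110)  # local File Storage endpoint (assumed)

def object_url(bucket: str, key: str) -> str:
    """Compose the address of an object in the local File Storage."""
    host, port = ENDPOINT
    return f"http://{host}:{port}/{bucket}/{quote(key)}"

def upload_file(path: Path, bucket: str, key: str) -> int:
    """PUT a local file into the bucket; requires a read/write policy.
    Returns the HTTP status code."""
    conn = http.client.HTTPConnection(*ENDPOINT)
    conn.request("PUT", f"/{bucket}/{quote(key)}", body=path.read_bytes())
    status = conn.getresponse().status
    conn.close()
    return status

print(object_url("public", "invoices2process/invoice1.png"))
```

Calling upload_file in a loop over a directory listing uploads many files in one run, something the web interface cannot do.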

Deleting Files from Bucket

You can delete files from your bucket if you don't need them anymore.

  1. Select the bucket to delete a file from. If you need to delete the file from a subfolder in the bucket, select the subfolder first.
  2. Select the file you want to delete, and click the Options icon, then select Delete.
  3. Click Delete to confirm deletion, or Cancel to abort the operation.

Using File Storage in Recording

For your recordings, you can use the Public bucket or create your own bucket as described above. When you create a new bucket, do not forget to set the Bucket policy.

Creating Folders

Folders in a bucket can be created either from a script or using the file system.

Creating Folder in File System

  1. Open a file manager, for example Windows Explorer.
  2. Go to your Intelligent Automation Cloud installation folder (by default, C:\IntelligentAutomationCloud\).
  3. Find the data folder (\IntelligentAutomationCloud\minio\data\).
  4. Find your bucket there (for example, public).
  5. Create a new folder in the bucket to upload your files to.

Uploading Files

 The files can be uploaded via File Storage or copied over the file system.

Copying Files over File System

  1. Open a file manager, for example Windows Explorer.
  2. Go to your Intelligent Automation Cloud installation folder (by default, C:\IntelligentAutomationCloud\).
  3. Find the data folder (\IntelligentAutomationCloud\minio\data\).
  4. Go to your folder in the bucket (for example, \public\invoices2process\).
  5. Copy your files to the folder. Now, you can create a list with your files to be used as the input data for your Business Process.
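The copy in step 5 can be scripted. A minimal sketch: the helper stages local files into the bucket folder under the MinIO data directory. To keep the example runnable anywhere, the demo uses a temporary directory as a stand-in for the real C:\IntelligentAutomationCloud\minio\data\ path; the folder names match the example above.

```python
import shutil
import tempfile
from pathlib import Path

def copy_into_bucket_folder(files, data_dir: Path, bucket: str, folder: str):
    """Copy local files into <data_dir>/<bucket>/<folder> so that
    File Storage picks them up from the file system."""
    target = data_dir / bucket / folder
    target.mkdir(parents=True, exist_ok=True)
    for f in files:
        shutil.copy2(f, target / Path(f).name)
    return sorted(p.name for p in target.iterdir())

# Demo against a temporary stand-in for the minio\data folder.
with tempfile.TemporaryDirectory() as tmp:
    data_dir = Path(tmp)
    src = data_dir / "invoice1.png"
    src.write_bytes(b"fake image bytes")
    names = copy_into_bucket_folder([src], data_dir, "public", "invoices2process")
    print(names)
```

Pointing data_dir at the real installation folder performs the same copy against the live File Storage.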

Creating Input Data Files


  1. Create a .csv file.
  2. Enter the column name, for example file_link.
  3. Open File Storage.
  4. Choose a file in your bucket, click the Options icon, and select Copy.
  5. Click Copy Link to copy the link to the clipboard.
  6. Paste the link from the clipboard into the .csv file. The pasted link looks as follows.

    http://localhost:15110/my-bucket/invoice1.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=workfusion%2F20190215%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20190215T130528Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=576bfcd6fdd29ee4afcb3afba928a1c3c3b5b5682eb668ccd897beed0f71153a
  7. Remove the question mark and all characters after it, as they are not needed, so the link has the following format:

    http://localhost:15110/my-bucket/invoice1.png
  8. Repeat the procedure for all files you are going to use as the Input data.
  9. Save and close the file.
  10. Use the file as Input data in your Business Process.
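Steps 5 through 9 can be automated: strip the query string from each copied link and write the cleaned links to the .csv with the file_link column from step 2. A sketch using only the standard library; the link values and the output file name are illustrative, and the demo writes to the system temp directory to stay self-contained.

```python
import csv
import os
import tempfile
from urllib.parse import urlsplit, urlunsplit

def strip_query(link: str) -> str:
    """Drop the '?' and everything after it from a presigned link."""
    parts = urlsplit(link)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

links = [
    "http://localhost:15110/my-bucket/invoice1.png?X-Amz-Algorithm=AWS4-HMAC-SHA256",
    "http://localhost:15110/my-bucket/invoice2.png?X-Amz-Expires=604800",
]

csv_path = os.path.join(tempfile.gettempdir(), "input_data.csv")
with open(csv_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["file_link"])  # column name from step 2
    for link in links:
        writer.writerow([strip_query(link)])

print(strip_query(links[0]))
```

The resulting file can then be used directly as the Input data of the Business Process, as in step 10.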

Opening Bucket to Public Access

  1. Open the folder where MinIO is installed (for example, C:\rpaexpress\minio).
  2. Run the following command in minio_mc to add your host:

    minio_mc config host add myminio http://127.0.0.1:15110 %userkey% %passwordkey%
  3. Run the following command in minio_mc to apply the needed policy to your bucket.

    minio_mc policy public myminio/public

    Now the file is openly accessible at a link like this: localhost:15110/public/invoices2process/invoice1.png

Using File Storage in Bot Task

Below you can see an example of how to use the Public bucket in the script (Bot Task).

Sample S3 File Storage
<var-def name="variable_name">
    <s3 bucket="bucket_name">
        <s3-put-public path='myfiles/tiff/${document_uuid}.tiff' content="${tiff_content}"/>
    </s3>
</var-def>

Code execution results

  • Subfolder myfiles/tiff/ is created in the Public bucket.
  • A .tiff file is put there under the name defined in ${document_uuid}.
  • The link to the file is included in the export for the next step.
  • The procedure is repeated for all files from the input data.
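The key and link that the s3-put-public step produces can be mirrored outside a Bot Task. A small sketch of that composition only (no upload logic); the port and the Public bucket name follow the examples earlier on this page, and the helper names are illustrative.

```python
import uuid

BUCKET = "public"  # Public bucket from the sample above

def tiff_key(document_uuid: str) -> str:
    """Key written by the s3-put-public step: myfiles/tiff/<uuid>.tiff."""
    return f"myfiles/tiff/{document_uuid}.tiff"

def tiff_link(document_uuid: str) -> str:
    """Link to the stored file on the local endpoint (port assumed)."""
    return f"http://localhost:15110/{BUCKET}/{tiff_key(document_uuid)}"

doc_id = str(uuid.uuid4())
print(tiff_link(doc_id))
```

Running this once per input row reproduces the repeated-per-file behavior described in the results list.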