
Medium One Documentation

Getting Started Guide

Welcome to your Medium One web portal. Let's get you started with:

  • Setting up your account and users
  • Sending data
  • Building a real-time workflow
  • Inspecting the results

Follow along with the Getting Started video tutorial below, or scroll down for a text version of the instructions.


Step 1: Create an API Key

  • An API Key determines the privileges and visibility you have within the Medium One platform.
  • On the Setup → Manage API Keys page, click Add New API Key. Enter a brief description and make sure Enabled is checked.

Note: You will need this newly created API Key to send events in a later step.



Step 2: Create an API Basic User

  • Now you will need to create an API Basic User to send data into the cloud.
  • An API Basic User can create/retrieve/update/delete event data, retrieve/update their own user information, and fetch processed events.
  • In this tutorial, you will create the API Basic User through our web portal. Users can also be created via the REST API. For more details, see the REST API section.
  • On the Setup → Manage Users page, click Add New User and fill out the username and password info.
  • After creating a new user, you should now see it appear on the Manage Users page. You may have to refresh to see this change.
  • Save the credentials for the API Basic User as we will be using them in our next step.



Step 3: Send a Sample Event

  • Let's use curl commands to send data on behalf of the API Basic User.
  • If you are using a Windows computer, you will need to download curl with SSL here.
    • You will also need the most recent CA certificates. Do not skip this step.
    • Download cacert.pem here. Rename it 'curl-ca-bundle.crt' and place it in the same folder as curl.exe.



  • First, we will need to log in using curl before sending the event.

    • In the curl command below, replace < API Basic User > with the username you created in Step 2. Be sure to keep the quotes around the username.
    • Replace < password > with the password for that API Basic User. Be sure to keep the quotes around the password.
    • Replace < API Key > with the API Key that you created in Step 1. Be sure to keep the quotes around the API Key.

    • Mac OS X or Linux

      curl -c cookies.txt -X POST -k -H 'Content-Type: application/json' -H 'Accept: application/json' -i 'https://api-sandbox.mediumone.com/v2/login' -d '{"login_id": "< API Basic User >", "password": "< password >", "api_key": "< API Key >"}'
    • Windows - keep all of the backslashes ( \ ) before the quotes.

      curl -c cookies.txt -X POST -k -H "Content-Type: application/json" -H "Accept: application/json" -i "https://api-sandbox.mediumone.com/v2/login" -d "{\"login_id\": \"< API Basic User >\", \"password\": \"< password >\", \"api_key\": \"< API Key >\"}"
    • If your login was successful, you will see an HTTP 200 response:

      HTTP/1.1 200 OK
      Server: nginx/1.4.6 (Ubuntu)
      Date: Tue, 01 Mar 2016 18:45:49 GMT
      Content-Type: application/json; charset=UTF-8
      Content-Length: 4
      Connection: keep-alive
    • Note: when using curl, you only need to log in once every 24 hours; the session cookie is saved in cookies.txt.


  • After logging in, you can now send an event to the cloud using that API Basic User.

    • In this tutorial, we will be sending "force_strength":67 and "level":"apprentice" in the same event. You can find the curl commands for this below:

    • Mac OS X or Linux

      curl -b cookies.txt -X POST -k -H 'Content-Type: application/json' -i 'https://api-sandbox.mediumone.com/v2/events/raw/<API Basic User>' -d '{"observed_at":"2015-03-04T12:00:00.0-0700", "event_data": {"force_strength":67, "level":"apprentice"}}'
    • Windows - keep all of the backslashes ( \ ) before the quotes.
      curl -b cookies.txt -X POST -k -H "Content-Type: application/json" -i "https://api-sandbox.mediumone.com/v2/events/raw/<API Basic User>" -d "{\"observed_at\":\"2015-03-04T12:00:00.0-0700\", \"event_data\": {\"force_strength\":67, \"level\":\"apprentice\"}}"
    • You should see the 200 success message.
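If you prefer Python to curl, the same login-and-send flow can be sketched with only the standard library. This is an illustrative sketch, not an official Medium One client: the endpoints are the ones shown in the curl commands above, and the placeholder credentials must be replaced with your own before running it.

```python
# Illustrative sketch: the same login + send-event flow as the curl
# commands above, using only the Python standard library.
import json
import urllib.request
from http.cookiejar import CookieJar

BASE = "https://api-sandbox.mediumone.com/v2"

def build_event(force_strength, level, observed_at):
    # Same request body as the curl example in Step 3
    return {"observed_at": observed_at,
            "event_data": {"force_strength": force_strength, "level": level}}

def send_sample_event(login_id, password, api_key):
    # A cookie jar plays the role of curl's cookies.txt
    opener = urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))

    def post(path, body):
        req = urllib.request.Request(
            BASE + path,
            data=json.dumps(body).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST")
        return opener.open(req)

    # Log in once; the session cookie is kept by the opener
    post("/login", {"login_id": login_id,
                    "password": password,
                    "api_key": api_key})
    # Send the same sample event as in the curl example
    resp = post("/events/raw/" + login_id,
                build_event(67, "apprentice", "2015-03-04T12:00:00.0-0700"))
    return resp.status  # 200 on success
```

Because the opener holds the session cookie, a single login call can be reused for subsequent event posts, mirroring curl's `-c`/`-b cookies.txt` behavior.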


  • Now that you've sent in an event, we need to Activate the tags to use them in workflows.



Step 4: Activate Tags

  • Tags classify the "keys" in your event data. Each tag is sent with a "value".
  • In the event you just sent in:
    • The tags are "force_strength" and "level"
    • The values are 67 and "apprentice"
  • To find out more about Tags and Values, click here.
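To make the tag/value split concrete, here is the sample event from Step 3 broken apart in a few lines of Python (an illustration of the terminology only, not platform code):

```python
# The event_data from the sample event sent in Step 3
event_data = {"force_strength": 67, "level": "apprentice"}

tags = list(event_data)             # the keys -> the tags you activate in Step 4
values = list(event_data.values())  # the value carried by each tag
```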


  • On the Config → Data Streams page, click Edit for the Raw Data Stream.
  • Select the Active checkbox for the two listed Tags (force_strength and level), and then Save Data Stream.



Step 5: Create a Workflow

  • Workflows process data from incoming events and generate new events.
    • These new events created from workflows are called processed events.
  • On the Workflow Studio page, click Create to create a new Workflow.
  • Once you've created the new Workflow, open the Tags & Triggers pane on the right toolbar, and under the "raw" dropdown, drag and drop the force_strength and level tags onto the main canvas.
    • These are the triggers that signify what runs the Workflow.
  • Next, from the Modules pane, click on the "Foundation" dropdown then drag and drop the Base Python module onto the canvas.
    • Base Python is a programmatic module that you can code in Python to process your data.
  • From the Outputs panel, drag and drop the Processed Data Stream module onto the canvas.
    • This output module signifies that your Workflow will generate a processed event every time it is triggered.
  • Double click the Base Python module box, expand the IONodes section, and click Add Input. A new row should appear for input "in2". Click Save.
  • With two inputs, you can connect your "force_strength" and "level" triggers to the Base Python module.
  • Connect the input and output connectors (click and drag) to make the Workflow look something like this:
  • Double click the Base Python module again, replace the default script, and enter the following in the script text area:
# Read the value wired to input 'in1' (the force_strength tag) and add 28
new_strength = IONode.get_input('in1')['event_data']['value'] + 28
new_level = "jedi"
# Emit a processed event carrying the two new tags
output = IONode.set_output('out1', {'new_strength': new_strength, 'new_level': new_level})
  • This script outputs a processed event with tags "new_strength" and "new_level" every time the Workflow runs:
    • "new_strength" is created by taking input 1 (in this case it is the "force_strength" tag) and adding 28 to that value.
    • "new_level" is set as "jedi"

Your Base Python module should now look like this:

  • Save the code in the Base Python module.
  • Lastly, you want to Activate the Workflow to make it go live.
    • Go to the Revisions panel, and Activate the Workflow by clicking on the check icon under the most recent revision.
    • A green "activated" icon should appear next to the revision name.
    • Your workflow is now live and will be generating an output whenever it is triggered.
  • In summary, this Workflow will run every time a "force_strength" or "level" tag is sent into the cloud. It will generate a processed event with "new_strength" and "new_level": "new_strength" will be 28 more than the value of the "force_strength" that was sent in, and "new_level" will be "jedi".
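You can reason about this transformation locally before triggering the Workflow. The `process` helper below is hypothetical (it is not the platform's IONode API); it simply mirrors the arithmetic of the Base Python script:

```python
def process(trigger_event):
    """Local mirror of the workflow's Base Python script.

    `trigger_event` stands in for IONode.get_input('in1'): a dict with
    the triggering tag's value under ['event_data']['value'].
    """
    new_strength = trigger_event["event_data"]["value"] + 28
    new_level = "jedi"
    return {"new_strength": new_strength, "new_level": new_level}

# The sample event carried force_strength 67, so new_strength is 67 + 28 = 95
result = process({"event_data": {"value": 67}})
```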

Note:

  • The star icon means this revision is favorited and cannot be deleted. Activated revisions are always starred and cannot be unstarred until the revision is deactivated.
  • In this example, the “in2” input is actually not used, and we chose to use output tag names that are different from the inputs.
  • Learn more about the Workflow Studio here.



Step 6: Triggering the Workflow to Generate a Processed Event

  • Now that you've created your Workflow, it's time to trigger it to generate a processed event.
  • Using curl, send the exact same event that you sent in Step 3.
  • This new event will automatically be processed by your newly created Workflow, which will generate a new processed event.



Step 7: View Tags of Processed Stream

  • Next, you will verify that the processed event is created and sent to the processed data stream:
  • As in Step 4, on the Config → Data Streams page, click Edit for Processed Events.
  • You will find the two new tags (new_strength and new_level) already listed and marked Active, so no changes are necessary.

Note: Tags on processed events are always automatically marked Active.



Step 8: Visualize Processed Data on Dashboard

  • Now, you can view your newly processed data on the Dashboard.
  • First, click on Dashboard on the navigation side bar.
  • Select the Single User Table widget from the widget selector.
  • From the Select User dropdown, select the username you gave to the API Basic User.
  • Click the configuration gear icon, and select the 4 tags (force_strength, level, new_strength, and new_level). If you don't see all 4 tags, you may need to refresh your browser.
  • You should now see both the raw and processed events in the single user table.


Learn more about the Dashboard Widgets here.



Congratulations, you have now created a Workflow and processed a simple data event. Now you are ready to connect your data sources, and get intelligent about your data.

Any questions? Contact us here: support@mediumone.com

