Allyn H

Making things, writing code, wasting time...


The new Bungie authentication flow!

Hi there,

As of the 14th of December, Bungie have changed their authentication flow for the Destiny API. Bungie are moving away from the cookie-based authentication flow to the OAuth 2.0 flow.

The code in the previous blog posts will still work, but Bungie will soon be switching over fully to the new authentication system.

This is good news for anyone using apps based on the Destiny API, as it is much more secure and offers users more control over how their data is used and what can be controlled from external applications.

You can read about the changes from the official Bungie release here:

The new Bungie OAuth flow.

I’m currently creating the Python code for the new authentication flow. The code is all working but I’m taking a week off to spend Christmas with my family, so stay tuned for the blog post with updated code 🙂

Creating a Python app for Destiny – Part 5: Reading a characters inventory and vault contents.

Introduction:

In the previous sections I showed how to:

  1. Send a request to read Xurs inventory.
  2. Send a HTML formatted email with Xurs inventory.
  3. Log into Destiny and Bungie.net via your PSN account.
  4. Transfer items from your vault to your character and equip them.

In this section I’m going to lay the foundations of a full inventory management system. This section will detail how to read the list of items that are currently equipped on your character and how to read the contents of your vault.

I’ll also show how, by downloading the full Destiny Manifest, we can speed up our code and drastically reduce the amount of HTTP requests we need to send to Bungie.

Also, I'd like to say a big thanks to all of the people on Reddit, Twitter and Bungie.net for their help and interest 🙂

You can find me on Twitter here @Allyn_H_

The getCharacterInventory endpoint:

The BungieNetPlatform wiki for this endpoint can be found here. This endpoint will return a JSON response with a list of each of the equipped items for a given character ID. It works in a similar way to the Xur adviser endpoint (seen in parts 1 & 2). The request will return a series of encoded hashes, which we’ll then need to send as a series of separate requests to query the Destiny manifest, in order to decrypt this data into a human readable format.

  • This is a public endpoint – no need to be logged in, anyone can use this.
  • This is a HTTP GET request – you don't need to send a JSON package.
  • Your destinyMembershipId needs to be attached to the URL as a parameter – more info here.
  • Your characterId needs to be attached to the URL as a parameter – more info here.

Making the request:

First we build the endpoint URL and make the GET request:

base_url = "https://www.bungie.net/platform/Destiny/"
req_string = base_url + membershipType + "/Account/" + destinyMembershipId + "/Character/" + charId + "/Inventory"
res = session.get(req_string)

The request URL will look like this:

https://www.bungie.net/platform/Destiny/2/Account/4611686018436136301/Character/2305843009222916165/Inventory/

Parsing the response:

Next we need to parse through the multidimensional JSON response and pick out the relevant data:

for equipment in res.json()['Response']['data']['buckets']['Equippable']:
    for item in equipment['items']:
        print "itemHash is: \t\t", item['itemHash']
        print "itemInstanceId is: \t", item['itemInstanceId']
        hashReqString = base_url + "Manifest/6/" + str(item['itemHash'])
        res2 = requests.get(hashReqString, headers=HEADERS)
        item_name = res2.json()['Response']['data']['inventoryItem']['itemName']
        item_tier = res2.json()['Response']['data']['inventoryItem']['tierTypeName']
        item_type = res2.json()['Response']['data']['inventoryItem']['itemTypeName']
        item_icon = res2.json()['Response']['data']['inventoryItem']['icon']
        print "Item name is: " + item_name
        print "Item type is: " + item_tier + " " + item_type
        print "Item icon is: http://www.bungie.net" + item_icon

The above code will read through the large JSON response and print out the data we want. For each item equipped on our character, we will print out the item's name, its tier type (common, legendary or exotic), its item type and its icon. The dict item ['Equippable'] is an array, so we need to loop through each piece of "equipment" and pick out each of the ['items'] stored there.

The code will return something like this:

itemHash is: 1703777169
itemInstanceId is: 6917529081613012276
Item name is: 1000-Yard Stare
Item type is: Legendary Sniper Rifle
Item icon is: http://www.bungie.net/common/destiny_content/icons/c1c49acd0fd146d7b32184f23c64dfe5.jpg

The GetVault endpoint:

The BungieNetPlatform wiki for this endpoint can be found here. This endpoint works in a similar way to the Xur advisor endpoint (seen in parts 1 & 2) and the getCharacterInventory endpoint: it returns a series of encoded hashes, which we'd then need to send as a series of separate requests to query the Destiny manifest. We currently have space for 288 items in the vault, so if the vault was full we'd need to make 289 requests (1 request to the vault + 288 requests to decrypt items) to the Bungie servers to get all the data we want! Making that many requests takes way too much time and eats way too much data – I'll explain how to remove these requests in a bit 🙂

  • This is a private endpoint – you need to be logged in, with a persistent HTTP session.
  • This is a GET request – you don’t need to send a JSON package.
  • Your destinyMembershipId needs to be attached to the URL as a parameter – more info here.

Making the request:

The code for this request will look like this:

getVault_url = base_url + membershipType + "/MyAccount/Vault/"
vaultResult = session.get(getVault_url, params={'accountId': destinyMembershipId})

Parsing the response:

Next we need to parse through the multidimensional JSON response and pick out the relevant data:

for bucket in vaultResult.json()['Response']['data']['buckets']:
    for item in bucket['items']:
        print item['itemHash']
        print item['itemInstanceId']
        hashReqString = base_url + "Manifest/6/" + str(item['itemHash'])
        res2 = requests.get(hashReqString, headers=HEADERS)
        myItem = item['itemHash']
        item_name = res2.json()['Response']['data']['inventoryItem']['itemName']
        item_tier = res2.json()['Response']['data']['inventoryItem']['tierTypeName']
        item_type = res2.json()['Response']['data']['inventoryItem']['itemTypeName']
        item_icon = res2.json()['Response']['data']['inventoryItem']['icon']
        print "Item name is: " + item_name
        print "Item type is: " + item_tier + " " + item_type
        print "Item icon is: http://www.bungie.net" + item_icon

The above code will read through the large JSON response and print out the data we want. For each item in our vault, we will print out the item's name, its tier type (common, legendary or exotic), its item type and its icon. The dict item ['buckets'] is an array, so we need to loop through each "bucket" and pick out each of the ['items'] stored there. Each ['buckets'] entry is a storage category in your vault, such as Armour, Weapons, Consumables, Shaders, Emblems, etc., and each of the ['items'] refers to an item stored in that vault space.

It's also important to remember that we need to collect the "itemInstanceId" for each item, as this is required to transfer the item to and from the vault.

hashReqString = base_url + "Manifest/6/" + str(item['itemHash'])
res2 = requests.get(hashReqString, headers=HEADERS)

The above lines make each of the requests to the manifest to decrypt the "itemHash" into a human readable format. This request is made for each of the items in our vault, and it's the real time-consuming part of the program.
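As a rough sketch of how you might keep those two values together while looping (the vault_items list below is a placeholder name, not part of the original script), ready to be dropped straight into a transferItem payload later:

vault_items = []
for bucket in vaultResult.json()['Response']['data']['buckets']:
    for item in bucket['items']:
        # Keep the generic hash and the unique instance ID together for later transfers:
        vault_items.append({
            'itemReferenceHash': item['itemHash'],
            'itemId': item['itemInstanceId']
        })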

Downloading the manifest:

So we've figured out how to find our vault contents and list the equipped items on our characters. We've also seen that, to decrypt and read the contents of our vault, we need to make 289 requests (more if Bungie ever decide to increase the vault capacity). This takes a huge chunk of time. I measured the amount of time it took to read my inventory by sending 267 requests (I had 266 items in my inventory): the script took 2 minutes and 43 seconds to finish!

So in order to speed things up, I downloaded the Destiny Manifest database onto my local computer; the code and directions can be found here on the destiny devs page.

The code can be downloaded here.

Once downloaded you can execute the code like so:

python manifest_destiny.py

The output looks something like this:

Output of the script used to download the Destiny Manifest.

This will download the full Manifest and store it in a Pickle file. This is also my first time using Pickle files; I assume this is done because it's easier to read the data back from a Pickle file than to query the SQL database each time.
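I haven't reproduced the download script here, but the general idea behind it is straightforward: the Manifest is an SQLite database, each definition table stores its entries as JSON blobs keyed by a hash, and the script loads them into one big nested dict before pickling it. A minimal sketch of that idea, assuming the unzipped Manifest database has been saved as 'manifest_content' and pulling only the inventory item definitions:

import json
import pickle
import sqlite3

# Open the unzipped Manifest database (the filename here is an assumption - use whatever your copy is called):
con = sqlite3.connect('manifest_content')
cur = con.cursor()

# Each row of a definition table holds the full definition as a JSON blob:
cur.execute("SELECT json FROM DestinyInventoryItemDefinition")
items = [json.loads(row[0]) for row in cur.fetchall()]

# Key each definition by its itemHash, matching the lookups used later in this post:
all_data = {'DestinyInventoryItemDefinition': {item['itemHash']: item for item in items}}

with open('manifest.pickle', 'wb') as data:
    pickle.dump(all_data, data)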

Here’s what they look like:

Copies of the Destiny Manifest and the Manifest as a Pickle file.

The files are 53MB and 71MB, so they're fairly big. The file sizes can be reduced by commenting out any of the Manifest items you don't need – for example, I downloaded a second copy of the Manifest with only the "itemHash" data included.

Here’s how I edited the code:

Changing the code to download only the itemHash data.
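The screenshot above isn't reproduced here; the gist of the edit is simply to comment out the definitions you don't need in the dictionary that maps Manifest table names to their hash fields. A purely illustrative sketch (the variable name and exact entries are hypothetical and will differ in your copy of the script):

hash_dict = {
    'DestinyInventoryItemDefinition': 'itemHash',
    # Everything else commented out, so only the item data is downloaded:
    # 'DestinyClassDefinition': 'classHash',
    # 'DestinyVendorDefinition': 'vendorHash',
}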

Here are the file sizes:

Copies of the itemHash manifest and Pickle file.

We can see here that the Pickle file we need has been reduced from 71MB to 37MB. This would also further reduce the run time of our programs. As memory isn't really an issue for me, I will be sticking to the full Manifest.

Reading the vault contents using the manifest:

Now that we have saved the Manifest as a Pickle file, we can pull the required data directly from this file instead of needing to make hundreds of HTTP requests every time we want to read our inventory or our vault contents.

To import the data from our Pickle file we use the following code:

import pickle

with open('manifest.pickle', 'rb') as data:
    all_data = pickle.load(data)

Now the object all_data will contain the entire Manifest contents.
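As a quick sanity check that the load worked (DestinyInventoryItemDefinition is the table we'll be using below), you can print a couple of things from it:

print "Manifest tables loaded: ", all_data.keys()
print "Number of item definitions: ", len(all_data['DestinyInventoryItemDefinition'])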

Previously we made requests to the Manifest via a HTTP request:

hashReqString = base_url + "Manifest/6/" + str(item['itemHash'])
res2 = requests.get(hashReqString, headers=HEADERS)
myItem = item['itemHash']
item_name = res2.json()['Response']['data']['inventoryItem']['itemName']

Where we pass the parameter “6”, which refers to the definition type “InventoryItem” and the specific “itemHash”.
Now we can read the item information from the nested dict object stored in all_data:

inventoryItem = all_data['DestinyInventoryItemDefinition'][item['itemHash']]
item_name = inventoryItem['itemName']

Updating the code to pull from our local Manifest copy:

Here’s how I modified the code used to parse our vault to pull the data directly from our manifest.pickle file instead of making a HTTP request:

# weapon_list and array_size are assumed to have been initialised earlier in the script:
for bucket in vaultResult.json()['Response']['data']['buckets']:
    for item in bucket['items']:
        weapon_list[array_size]['itemReferenceHash'] = item['itemHash']
        weapon_list[array_size]['itemId'] = item['itemInstanceId']
        # Look the itemHash up in our local copy of the Manifest - no HTTP request needed:
        inventoryItem = all_data['DestinyInventoryItemDefinition'][item['itemHash']]
        item_name = inventoryItem['itemName']
        item_tier = inventoryItem['tierTypeName']
        item_type = inventoryItem['itemTypeName']
        item_icon = inventoryItem['icon']
        print "Item name is: " + item_name
        print "Item type is: " + item_tier + " " + item_type
        print "Item icon is: http://www.bungie.net" + item_icon

Run time improvements:

We can see from the table below that downloading the Manifest and removing the need for ~290 additional HTTP requests drops the run time from 220 seconds down to around 16 seconds. A lot of the 16 seconds seems to be spent parsing the Pickle file, which again could be optimised by either using the raw SQL data or storing it in some other format. For now though, the convenience of using the Pickle file far outweighs the downsides.

Manifest Type:                                      Run time:
Multiple HTTP requests to Bungie Manifest           220.0 seconds
manifest.pickle – full download                     22.3 seconds
item.pickle – only itemHash data                    16.2 seconds

Viewing the vault contents as a HTML file:

As a quick and easy way of outputting the vault contents into a HTML file, I used the same template file and code format as used in the guide for emailing Xurs inventory.

Here are the steps:

  • Append the HTML header to the top of the my_html string.
  • Modify the code used to parse the vault contents to output a HTML formatted string.
  • Append the HTML footer to the bottom of the my_html string.
  • Concatenate the header, vault contents and footer HTML strings into the my_html string and output this to a file called vault_contents.html.

Parsing the response as HTML:

# my_html is assumed to already hold the template header, and array_size to be initialised to 0:
for bucket in vaultResult.json()['Response']['data']['buckets']:
    for item in bucket['items']:
        # Decode the itemHash using our local copy of the Manifest:
        inventoryItem = all_data['DestinyInventoryItemDefinition'][item['itemHash']]
        item_name = inventoryItem['itemName']
        item_tier = inventoryItem['tierTypeName']
        item_type = inventoryItem['itemTypeName']
        item_icon = "http://www.bungie.net/" + inventoryItem['icon']
        print "Item name is: " + item_name
        array_size += 1
        print "Item is: " + item_name
        print "Item type is: " + item_tier + " " + item_type + "\n"
        # Build up the HTML for this item - a thumbnail, the item name and its type:
        my_html = my_html + "\t\t<div class=\"col-md-4\">\n"
        my_html = my_html + "\t\t\t<div class=\"thumbnail\">\n"
        my_html = my_html + "\t\t\t\t<a href=\"" + item_icon + "\">\n"
        my_html = my_html + "\t\t\t\t<img src=\"" + item_icon + "\">\n"
        my_html = my_html + "\t\t\t\t</a>\n"
        my_html = my_html + "\t\t\t\t<h3>" + item_name + "</h3>\n"
        my_html = my_html + "\t\t\t\t<p>" + item_tier + " " + item_type + "</p>\n"
        my_html = my_html + "\t\t\t</div>\n"
        my_html = my_html + "\t\t</div>\n"
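The final step from the list above isn't shown in the snippet; it's just the same pattern as the Xur email post – append the closing HTML and write the whole string out. A short sketch, assuming html_footer holds the closing tags (it isn't defined in the original code):

# Append the closing HTML tags and write everything out to vault_contents.html:
my_html = my_html + html_footer
with open('vault_contents.html', 'w') as output_file:
    output_file.write(my_html)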

Here’s what the output of the code looks like:

Even have the VoG boots. #Y1RaidStruggles

Vault contents as formatted HTML.

Running the code:

The full set of Python code can also be found on my GitHub page here: https://github.com/AllynH/Destiny_Read_Characters_Inventory_Vault

As there are a few files, you'll need to copy the code from there. In order to make the code work for you, you'll need to input your username, password, api_key, destinyMembershipId and characterId in the Header_file.py file.

Here are the values you’ll need to change:

# PSN Username:
username = emailaddr
password = mypassword

# Destiny API X-Key:
API_KEY = ""

# Destiny parameters:
membershipType = "2" # PS4 = 2
destinyMembershipId = ""

characterId = ""

You can then run the code like so:

> python Read_Inventory_and_Vault.py

As always, I’ll try to keep the GitHub repo up to date with any changes I make.

Next steps:

The code as it is does what we want: it reads the contents of our vault and prints it to a HTML file. However, it's not really a fully functioning inventory management system yet.

To create a fully working web application, the code will need to be built into a web framework. Luckily for me, I've been reading up on the Python Flask web framework for the last few weeks 🙂

Creating a Python app for Destiny – Part 4: Transferring and equipping items.

Introduction:

In the previous sections I showed how to:

  1. Send a request to read Xurs inventory.
  2. Send a HTML formatted email with Xurs inventory.
  3. Log into Destiny and Bungie.net via your PSN account.

I'm now going to build on the previously created code to create an app that can transfer an item to and from the vault; once the item has been transferred, we can then equip it.

If you've been reading along with me up to this point, you should understand all of this; if not, now is a good time to check out my previous posts.

Here's the code working (sorry about the camera shaking, it was recorded using my phone):

Destiny the game, Equipping an Item with the Destiny API. from Allyn H on Vimeo.

Also, I'd like to say a big thanks to all of the people on Reddit, Twitter and Bungie.net for their help and interest 🙂

You can find me on Twitter here @Allyn_H_

The transferItem endpoint:

The Destiny API uses web server endpoints to execute commands; the endpoint used to transfer items is: https://www.bungie.net/platform/Destiny/transferItem/

The BungieNetPlatform wiki for this endpoint can be found here. We can see this method is very different from the Xur advisors endpoint we used in the previous examples.

Here are the main differences:

  • We still need to use the X-API key in our header.
  • We also have to attach our x-csrf token in our header.
  • This is a private endpoint – we need to be logged in to use this method.
    • We need to attach the required cookies to our POST request.
  • This is a POST method – we will need to send data with this request.
    • We need to send the data as a JSON packet.
  • The response will only tell us if we have been successful or not – the Xur advisors response was >1000 lines of text.

The code for storing the transferItem endpoint variable will look like this:

base_url = "https://www.bungie.net/platform/Destiny/"
req_string = base_url + "TransferItem/"

Important note: If the trailing “/” is missing from the “transferItem/” endpoint – the request will fail. I’ve spent way longer than I’m willing to admit debugging that issue :/

Creating the POST request payload:

The "transferItem" endpoint takes a JSON payload as an input. This JSON payload contains a lot of information, so let's take a look at how it's put together.

The data gathered to populate this payload was found by looking at items I had equipped, using the "GetCharacterInventory" endpoint; you can find more info here. This will list the relevant information for all of your equipped items – let me know if you want to see the code for this.

  • membershipType – This will be 2 for PSN and 1 for Xbox Live.
  • itemReferenceHash – This is the "itemHash" number from our previous examples. This refers to the generic item, for example: 1000-Yard Stare.
  • itemId – This is a unique item number relating to a specific item that you own. For example: Allyn's 385 light level 1000-Yard Stare with the Ambush Scope, Quickdraw and Firefly.
  • stackSize – How many items to transfer. Should be "1" for equipable items.
  • characterId – The characterId the item is being moved to or from. You can find the characterId here.
  • transferToVault – Move the item to or from the vault; true or false.

Here’s how this method expects the JSON data to be formatted:

This is the BungieNetPlatform example for the transferItem POST payload

The good news here is it’s pretty easy to put this together in Python by creating a Dictionary object.

Making a POST request:

We’ve already looked at making HTTP GET requests with Python, but the Python Requests library also makes it really easy to make a POST request.

You can view the Requests documentation here. Here's a quick example of how to send some data formatted as a Python dictionary:

Python Requests POST example
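The screenshot isn't reproduced here, but it was based on the standard example from the Requests documentation, which looks roughly like this (using the httpbin.org test service):

payload = {'key1': 'value1', 'key2': 'value2'}
r = requests.post("http://httpbin.org/post", data=payload)
print r.text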

The Requests library will also automatically encode a Python dict into a JSON object for you when you use the "json" parameter:

Python Requests POST JSON example
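Again the screenshot is missing, but the equivalent example using the "json" parameter looks roughly like this:

payload = {'some': 'data'}
r = requests.post("http://httpbin.org/post", json=payload)
print r.text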

Now that we know how to create a Python POST request and what data the transferItem endpoint is looking for, let's turn that into some code!

Moving an item from your vault to your inventory:

As mentioned above, the data gathered to populate this payload was found by looking at items I had equipped, using the "GetCharacterInventory" endpoint; you can find more info here. I'm planning on putting together a blog post on this too, but let me know if you'd like to see the code for this.

First we create our Python dictionary object:

text_payload = {
    "membershipType": 2,
    "itemReferenceHash": 1519376148, # The Ram
    "itemId": 6917529085991104887, # The Ram
    "characterId": characterId_Warlock,
    "stackSize": 1,
    "transferToVault": False
}

Next, we convert the dictionary object into a JSON object:

payload = json.dumps(text_payload)

Now we can make the actual POST request to the Destiny servers to transfer our item.

base_url = "https://www.bungie.net/platform/Destiny/"
req_string = base_url + "TransferItem/"
res = session.post(req_string, data=payload)

The JSON response should look like this:

{
    "ThrottleSeconds": 0, 
    "ErrorCode": 1, 
    "ErrorStatus": "Success", 
    "Message": "Ok", 
    "Response": 0, 
    "MessageData": {}
}

If the item was not found in your vault, you’ll get a response like this:

{
    "ThrottleSeconds": 0, 
    "ErrorCode": 1623, 
    "ErrorStatus": "DestinyItemNotFound", 
    "Message": "The item requested was not found.", 
    "Response": 0, 
    "MessageData": {}
}

For any other issues – for example, if you were to omit the trailing "/" from the "https://www.bungie.net/platform/Destiny/TransferItem/" URL – the request would fail but you wouldn't receive a JSON response; instead, your code would error and you'd receive a HTTP 405 status code. You can find the status code by using the following code:

print res.status_code
405

If you see something like that coming up, you’ll need to review and fix your code.

Equipping an item from your inventory:

Now that the item is in our character's inventory, we can equip it on that character.

The endpoint used to equip an item from your inventory is: https://www.bungie.net/platform/Destiny/EquipItem/ – you can find more information about this endpoint here.

Just a note: this can only be done when in a social space or in orbit (otherwise it'd be really easy to cheat in PvP – can you imagine if you could do this in Trials???). Anywhere else, the action will fail and return an error message.

Creating the POST request payload:

The POST request to equip an item is smaller and we already have the details of the item we want to equip.

text_equip_payload = {
    "membershipType": 2,
    "itemId": 6917529085991104887, # The Ram
    "characterId": characterId_Warlock
}
equip_payload = json.dumps(text_equip_payload)

 

equip_url = base_url + "EquipItem/"
res = session.post(equip_url, data=equip_payload)

Again, if the item was not found, you’ll get a response like this:

{
    "ThrottleSeconds": 0, 
    "ErrorCode": 1623, 
    "ErrorStatus": "DestinyItemNotFound", 
    "Message": "The item requested was not found.", 
    "Response": 0, 
    "MessageData": {}
}

If you’re trying to equip an exotic and you already have one equipped, you’ll see an error message like this:

{
    "ThrottleSeconds": 0, 
    "ErrorCode": 1641, 
    "ErrorStatus": "DestinyItemUniqueEquipRestricted", 
    "Message": "You can only have one item of this type equipped.", 
    "Response": 0, 
    "MessageData": {}
}

Running the code:

Here is the full set of Python code; this can be copied into a file called "equipItem.py" and executed from the command prompt like so:

> python equipItem.py

The code can also be found on my GitHub page here: https://github.com/AllynH/Destiny_Equip_Item

As always, I’ll try to keep the GitHub repo up to date with any changes I make.

Creating a Python app for Destiny – Part 3: Logging in to Bungie.net and authenticating with PSN

Introduction:

In the previous sections I showed how to:

  1. Send a request to read Xurs inventory.
  2. Send a HTML formatted email with Xurs inventory.

I want to build on the previously created code to create an app that can transfer an item to and from the vault, and equip items.

In order to do that, our code will need to log in to Bungie.net and authenticate the account with PSN.

Logging in to Bungie.net and authenticating with PSN:

We are going to use the Python "Requests" package to log in to Bungie.net using our PSN account details and OAuth 2.0 to authenticate our connection with PlayStation Network.

The good people at BungieNetPlatform have put together some guides on how to connect with Bungie.net, get authenticated with PSN (or Xbox Live – but I’m on PS4) and grab the required cookies. For this example, I used the code provided by Quantum Ascend here.

You can also see his step by step instructions here.

Here are the steps to this section:

  1. Sign in on Bungie.net via PSN – this will redirect you to the PSN sign in page.
  2. Grab our PSN session ID.
  3. Login to PSN (via OAuth) using our PSN username, password and adding our session ID as a cookie.
  4. Receive a unique PSN sign-in URL and an updated JSESSIONID.
  5. Request the PSN X-NP-GRANT-CODE, using the updated JSESSIONID.
  6. Sign in to Bungie.net by adding the grant code to our original URL.
  7. Grab our Bungie.net authentication cookies.

Here is a flow chart detailing these steps:

Future War Cult colours - representing!

Bungie.net Sign-in flow chart

Request 1 is done in this way to accommodate sign-in for both PlayStation and Xbox accounts. However, as I only have a PlayStation 4, I'm not working on the Xbox Live sign-in; you can find code for that in the BungieNetPlatform guide here.

# Imports needed for this snippet:
from base64 import b64encode
from urlparse import urlparse

request1 = requests.get(BUNGIE_SIGNIN_URI, allow_redirects=True)
jsessionid0 = request1.history[1].cookies["JSESSIONID"]
params = urlparse(request1.url).query
params64 = b64encode(params)

Request 2 sends a POST request to the PSN sign in page. Our login credentials are passed in a dictionary format; these are then form-encoded (by the Requests package, as a HTML form) when the request is made. We also create a cookie with the JSESSIONID we received from Request 1.

The response from Request 2 returns an updated JSESSIONID, also stored in a cookie – we save this updated value. The if statement checks whether an authentication error was returned; if not, our login credentials were correct.

request2 = requests.post(PSN_OAUTH_URI, data={"j_username": username, "j_password": password, "params": params64}, cookies={"JSESSIONID": jsessionid0}, allow_redirects=False)
if "authentication_error" in request2.headers["location"]:
    logger.warning("Invalid credentials")
jsessionid1 = request2.cookies["JSESSIONID"]

Request 3 sends a GET request to the returned PSN OAuth sign-in URL, adding the updated JSESSIONID as a cookie. This will give us our x-np-grant-code.

request3 = requests.get(request2.headers["location"], allow_redirects=False, cookies={"JSESSIONID": jsessionid1})
grant_code = request3.headers["x-np-grant-code"]

The PSN OAuth sign-in URL will look something like this:

https://auth.api.sonyentertainmentnetwork.com/2.0/oauth/authorize?response_type=code&client_id=78xxx&redirect_uri=https%3a%2f%2fwww.bungie.net%2fen%2fUser%2fSignIn%2fPsnid&scope=psn:s2s&request_locale=en

Request 4 makes the final request to the Bungie.net sign-in page, attaching the x-np-grant-code to the URL. The "params" argument in the Requests library attaches this code to the URL as a query parameter.

request4 = requests.get(BUNGIE_SIGNIN_URI, params={"code": grant_code})

The Bungie.net sign in URL with the x-np-grant-code attached should look something like this:

https://www.bungie.net/en/User/SignIn/Psnid?code=Nxxxh

Now that we have authorised our Bungie.net account with PSN, we can create a persistent session and send multiple requests.

Creating a persistent HTTP Session:

A persistent HTTP session is used to keep our HTTP connection alive allowing us to make multiple requests without the need to sign in and authenticate each time. This means we will only need to authorise our account once and can make multiple requests – so long as we attach the relevant authorisation data. This authorisation data is stored in the cookies and header data we send in our requests. The python requests package has a “Session” object, used for just this thing!

To create a HTTP session, we need to do 2 things:

  1. Send the required HTTP header data:
    • X-API-Key – the Application Programming Interface key we got from registering at Bungie.net.
    • x-csrf – our Cross Site Request Forgery protection token, received from Bungie.net after we have authenticated our app with PSN.
  2. Attach the required cookies with the correct, authenticated data:
    • bungled – received from Bungie.net after we have authenticated our app with PSN (this is also our x-csrf token).
    • bungleatk – received from Bungie.net after we have authenticated our app with PSN.
    • bungledid – received from Bungie.net after we have authenticated our app with PSN.

Here’s what that looks like when translated into Python code – first we create a requests Session object:

session = requests.Session()

Next, we add our X-API-KEY and x-csrf token to the session header:

session.headers["X-API-Key"] = API_KEY
session.headers["x-csrf"] = request4.cookies["bungled"]

Then we create our Cookies and attach them to the requests session object:

session.cookies.update(
 {
    "bungleatk": request4.cookies["bungleatk"], 
    "bungled": request4.cookies["bungled"], 
    "bungledid": request4.cookies["bungledid"]
 })

That’s it! We’re done – our app can now log into Destiny via PSN. This will allow us to use all of the private endpoints provided by the API and do lots of cool stuff, such as transferring items, equipping items, locking items, etc.
I’ll build on this code again in my next blog post.

Running the code:

Here is the full set of Python code; this can be copied into a file called "PSN_login.py", in the same directory as your own code, and used like so:

import requests
from PSN_login import login

username = emailaddr
password = mypassword
api_key = API_KEY

# Log in via PSN and create our persistent HTTP session: 
session = requests.Session()
session = login(username, password, api_key)

Here’s the link to the code on my GitHub account:

https://github.com/AllynH/Destiny_Equip_Item/blob/master/PSN_login.py


Creating a Python app for Destiny – Part 2: Emailing Xurs inventory.

Introduction:

In the previous tutorial, I showed how you could use Python to send a request using the Destiny API to the Bungie servers, and how to decrypt the JSON reply. If you haven't read the previous tutorial, it's right here. I'm going to gloss over how to write the HTML, as there are plenty of really good online resources for creating and styling websites; check out Codecademy if you're interested in a free guide.

In this tutorial I will continue on with the program we have created, to:

  1. Change our Get_Xur_inventory.py program so it writes the output as a HTML file.
  2. Have the program send this HTML, via Gmail, directly to our email.

By writing our output as HTML we have a lot more control over the design and formatting of the email. Using HTML will also allow us to directly embed the item pictures into our email, from the URL links provided in the JSON response from Xur.

The hard part of the code is already done; with a few minor tweaks, we can have our program output Xurs inventory into a HTML formatted email.

 

Outputting Xurs inventory as HTML:

In order to create our output HTML, we are going to use a template HTML file to store as much of the generic HTML as possible, and change all of the print statements in our code to output HTML formatted data.

An HTML file is split into 3 main parts: the HTML version information, the <head> section and the <body> section. We are going to take advantage of that and save the HTML version information and <head> section in a template HTML file, which we can reuse every time we run the script. This allows us to update any of the generic HTML code separately (for example, if you wanted to change some styling or CSS information) without messing with your Python code.

We can then use our script to generate only the HTML needed to display the items from Xurs inventory; this will change every week, so it needs to be generated every time we run our script.

Opening our template HTML file:

The following code will open a file called “template.html” for reading, and save all of the contents into a string object called “my_html”, then close the file.

template_file = open('template.html', "r")
my_html = template_file.read()
template_file.close()

Outputting HTML from our Python code:

Below is an example of how we'd like to format the HTML code; essentially we wrap it in a couple of <div>'s and display the image and text.

<div class="col-md-4">
 <div class="thumbnail">
 <a href="item_url">
 <img src="item_url">
 </a>
 <h3>Item name: item_name</h3>
 <p>Item type is: item_type</p>
 <p>Description: item_description</p>
 </div>
</div>

Here’s what the corresponding Python code looks like:

my_html = my_html + "<div class=\"col-md-4\">\n"
my_html = my_html + "\t<div class=\"thumbnail\">\n"
my_html = my_html + "\t\t<a href=\"" + item_url + "\">\n"
my_html = my_html + "\t\t<img src=\"" + item_url + "\">\n"
my_html = my_html + "\t\t</a>\n"
my_html = my_html + "\t\t<h3>" + item_name + "</h3>\n"
my_html = my_html + "\t\t<p>" + item_type + "</p>\n"
my_html = my_html + "\t\t<p>" + item_description + "</p>\n"
my_html = my_html + "\t</div>\n"
my_html = my_html + "</div>\n"

Each of these lines builds up a string of HTML and concatenates it onto the end of the my_html object. We've already populated all of the item_url, item_name, item_type and item_description variables earlier in our code.

So here’s what each line is doing. In this example we are creating a <h3> heading, with the text from our “item_name”. “\t” creates a tab and “\n” moves to a new line. All of this text is concatenated into one string, which we add on to the end of the “my_html” string object.

my_html = my_html + "\t\t<h3>" + item_name + "</h3>\n"

Closing the HTML:

Now that we have created and populated <div>’s with each of Xurs items, we can add the code to close the HTML </body> and </html> tags.

my_html = my_html + "\t\t</div> <!-- row -->\n"
my_html = my_html + "\t</div> <!-- container -->\n"
my_html = my_html + "</div> <!-- inventory-container -->\n"
my_html = my_html + "</body>\n"
my_html = my_html + "</html>\n"

The “my_html” string will store all of the HTML code used in the body of the email we send.

Sending an email with Python:

Python comes installed with a number of really useful libraries, including the smtplib (Simple Mail Transfer Protocol library) and the MIMEMultipart and MIMEText (Multipurpose Internet Mail Extensions, allows text, images and other options) libraries.

For this section I referenced two tutorials, here Nael Shaib shows how to make a basic email program, and here Darren Massena shows how to create a HTML formatted email and attach external pictures.

Some important notes:

Google recently changed their security requirements, so to use this program, you will need to change your Google settings to allow less secure apps. I needed to do the following:

  1. Click here to allow less secure apps. Google will still reject access to your account, until you authorise it.
  2. Click here to authorise your app. This will authorise your app access to your Google account.
  3. Gmail and some other mail clients do not support CSS styling, so if you wanted to convert some existing CSS to inline HTML, you could use a tool like this. That being said, my iPad and Android phone both display mails with full CSS – so I’ve included some CSS styling in my code 🙂
Enable access for less secure apps.

Alright, that's enough talk – let's get to coding!

Importing the required libraries:

First let's import the required libraries:

import smtplib
from email.MIMEMultipart import MIMEMultipart
from email.MIMEText import MIMEText

Next we'll set up our email address parameters; enter your details as follows.

# Mail parameters:
fromaddr = "FROM_ADDRESS"
toaddr = "TO_ADDRESS"
password = "GMAIL_PASSWORD"

Next, lets create the email header information:

# Compose mail: 
msgRoot = MIMEMultipart() 
msgRoot['From'] = fromaddr 
msgRoot['To'] = toaddr 
msgRoot['Subject'] = "Xurs Inventory." 
msgRoot.preamble = "This is a multipart message in MIME format." 

The line:

msgRoot = MIMEMultipart()

creates an email MIMEMultipart object; we then set the ['To'], ['From'] and ['Subject'] parameters of the email on the following 3 lines.

Creating the email body:

We’ve already populated the HTML into a string object called “my_html”.

The "my_html" string object is added to the body of the email with the following commands; the 2nd parameter passed to the MIMEText object sets the email type as HTML:

msgAlternative = MIMEMultipart('alternative')
msgRoot.attach(msgAlternative)
msgText = MIMEText(my_html, 'html')
msgAlternative.attach(msgText)

The following commands create an SMTP mail object and connect to the Gmail SMTP server on port 587:

server = smtplib.SMTP('smtp.gmail.com', 587)
server.ehlo()
server.starttls()

The next line actually logs into our Gmail account; we pass our Gmail address and password as parameters.

server.login(fromaddr, password)

The next 2 lines send the email and then close the connection to the server:

server.sendmail(fromaddr, toaddr, msgRoot.as_string())
server.quit()

Running the code:

Here is the full set of Python code; this can be copied into a file called "Get_Xur_inventory_email.py" and executed from the command prompt like so:

> python Get_Xur_inventory_email.py

The code can also be found on my GitHub page here: https://github.com/AllynH/Destiny_Get_Xur_inventory_email

As always, I’ll try to keep the GitHub repo up to date with any changes I make.

Creating a Python App for the Destiny API

Introduction:

I’m a huge fan of Bungie’s Destiny game, and have been playing it since launch. In Destiny there is a vendor, called Xur, who visits the game each week and sells exotic weapons and armour. Xur is only available from Friday to Sunday mornings, and appears in a different location each week – because of this he can be quite hard to track down. As with most people – I’m at work every Friday and usually busy with family time over the weekend, so I’m often left in a situation where I miss Xur or don’t have the time to log in and see what he’s selling.

This just wasn't good enough – I couldn't risk missing out on the next Gjallarhorn! So let's build an app to find Xurs inventory each week.

 

API registration:

In order to build an app using the Bungie API, you need to create a Bungie.net account and register as a developer. This only takes a few minutes; to register as a developer, follow this link: https://www.bungie.net/en/User/API

This will give you your unique X-API-Key. For every request we make to the Bungie servers, we need to send this API key in the HTTP header of the request.

 

App flow chart:

Finding Xurs inventory isn't as straightforward as expected. Bungie have implemented a method for reading Xurs inventory – but it returns the data in an encrypted format, as an item hash.

The reason behind this is, even items of the same type can have different perks. For example, each gun can have a different scope type, different perks, and different barrel modifications. So even 2 of the same guns can be completely different.

Because of this, we’ll need to send multiple requests to the Bungie servers and read multiple replies.

FWC representing!

Flow chart for the app.

 

Python HTTP requests:

Requests is a Python HTTP library; if you're not familiar with it, start here. You may need to install it on your computer. If you have pip installed, Requests can be installed by typing:

pip install requests

 

Sending our request to Xurs advisors page:

Our first HTTP request will be to Xurs Advisors page; here is the link: https://www.bungie.net/Platform/Destiny/Advisors/Xur/

The X-API-Key is passed as a parameter with every HTTP request made to the Bungie servers:

HEADERS = {"X-API-Key":'MY-X-API-Key'}

 

Here is how we make the request:

xur_url = "https://www.bungie.net/Platform/Destiny/Advisors/Xur/"
print "\n\n\nConnecting to Bungie: " + xur_url + "\n"
print "Fetching data for: Xur's Inventory!"
res = requests.get(xur_url, headers=HEADERS)

It’s as simple as that.

So what are we doing? The line:

res = requests.get(xur_url, headers=HEADERS)

Makes a HTTP request to the URL stored in the "xur_url" string; the X-API-Key is also added to the request in the "headers" dictionary object.

Now the object “res” contains the JSON response received from the Bungie servers.

 

Parsing the JSON response:

The JSON response received from the request is pretty big – in my case it contained 1018 lines of text! So what do we do with this data? First things first, we need to know if our request was received and processed correctly. The JSON object contains a key called "ErrorStatus"; this key is used to store the status of the request, so let's print the value of this key:

error_stat = res.json()['ErrorStatus']
print "Error status: " + error_stat + "\n"

Here is what the code outputs:

Successful connection - whoop whoop!

Successful connection.

Here is an example of what the code would output if there was a successful request but Xur was not available (he’s only available from Friday to Sunday):

Office hours are Friday to Sunday morning!

Connection is successful but Xur is nowhere to be found.
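It's worth guarding the rest of the script with a check on this status before trying to parse the inventory; a minimal sketch (the exact message is up to you):

import sys

if error_stat != "Success":
    print "Xur isn't available right now - he only appears from Friday to Sunday morning!"
    sys.exit(1)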

 

Parsing the multidimensional JSON response:

Looking through the "res" JSON object, we can see a key called "itemHash", located in res['Response']['data']['saleItemCategories']['saleItems']['item']['itemHash']. This is the location of the encoded item details we are looking for!

Here is where things get tricky… JSON data is used to represent key-value pairs, called dictionaries in Python. However, the item "saleItemCategories" itself contains a series of key-value pairs, and the item "saleItems" in turn contains more key-value pairs.

Our JSON object contains nested dictionaries, also known as a multidimensional array.

Lots and lots of data!

Decoding the JSON response for Xurs inventory.

 

So in order to make a list of each of the “itemHash” values, we need a for loop to iterate through each of the “saleItemCategories” and another for loop to iterate through each of the “saleItems”.

for saleItem in res.json()['Response']['data']['saleItemCategories']:
	mysaleItems = saleItem['saleItems']
	for myItem in mysaleItems:
		hashID = str(myItem['item']['itemHash'])

 

Request 2 – decoding the item hash:

Now that we have a list of our itemHashes, we need to send a request to the Bungie Destiny Manifest page for each of the hashes.

This request takes the form of: http://www.bungie.net/Platform/Destiny/Manifest/{type}/{id}/

Where the {id} is the itemHash we just took from Xurs inventory.

A list of the hash {type}’s can be found here: http://bungienetplatform.wikia.com/wiki/DestinyDefinitionType but for this example, we know {type} will be “6”, as we are searching for an “InventoryItem”.

We can add on another request to our nested loops above:

base_url = "https://www.bungie.net/platform/Destiny/"
for saleItem in res.json()['Response']['data']['saleItemCategories']:
	mysaleItems = saleItem['saleItems']
	for myItem in mysaleItems:
		hashID = str(myItem['item']['itemHash'])
		hashReqString = base_url + "Manifest/" + hashType + "/" + hashID
		res = requests.get(hashReqString, headers=HEADERS)
		item_name = res.json()['Response']['data']['inventoryItem']['itemName']
		item_type = res.json()['Response']['data']['inventoryItem']['itemTypeName']
		item_tier = res.json()['Response']['data']['inventoryItem']['tierTypeName']
		print "Item is: " + item_name
		print "Item type is: " + item_tier + " " + item_type + "\n"

Here is what the output of our code looks like:

Here’s what Xur is selling today.

And when we check out Xurs inventory on the Bungie.net vendors page, here's what we see:

Xurs inventory, taken from the Bungie.net vendors page.

Yes! Our Python app can now send requests to Bungie and print the contents of Xurs inventory. (Also, if you haven’t got The Ram – get it)

There’s lots of cool stuff sent in the JSON objects, including hyperlinks to the item images and details on the price of each item. For now, this is good enough for me.
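For example, inside the same loop, the Manifest reply we're already parsing also contains the path to each item's icon (the same 'icon' key used elsewhere on this blog), so grabbing an image link is a one-liner:

item_icon = res.json()['Response']['data']['inventoryItem']['icon']
print "Item icon is: http://www.bungie.net" + item_icon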

Running the code:

Here is our full set of code; we can copy this into a file called "Get_Xur_inventory.py" and execute it from the command prompt like so:

> python Get_Xur_inventory.py

The code can also be found on my GitHub page here: https://github.com/AllynH/Destiny_Get_Xur_inventory

I’ll keep the repo up to date with any improvements or changes.

from lxml import html
import requests
import json

# Uncomment this line to print JSON output to a file:
#f = open('output.txt', 'w')

HEADERS = {"X-API-Key":'YOUR-X-API-Key'}

base_url = "https://www.bungie.net/platform/Destiny/"
xur_url = "https://www.bungie.net/Platform/Destiny/Advisors/Xur/"
hashType = "6"

# Send the request and store the result in res:
print "\n\n\nConnecting to Bungie: " + xur_url + "\n"
print "Fetching data for: Xur's Inventory!"
res = requests.get(xur_url, headers=HEADERS)

# Print the error status:
error_stat = res.json()['ErrorStatus']
print "Error status: " + error_stat + "\n"

# Uncomment this line to print JSON output to a file:
#f.write(json.dumps(res.json(), indent=4))

print "##################################################"
print "## Printing Xur's inventory:"
print "##################################################"

for saleItem in res.json()['Response']['data']['saleItemCategories']:
	mysaleItems = saleItem['saleItems']
	for myItem in mysaleItems:
		hashID = str(myItem['item']['itemHash'])
		hashReqString = base_url + "Manifest/" + hashType + "/" + hashID
		res = requests.get(hashReqString, headers=HEADERS)
		item_name = res.json()['Response']['data']['inventoryItem']['itemName']
		print "Item is: " + item_name
		item_type = res.json()['Response']['data']['inventoryItem']['itemTypeName']
		item_tier = res.json()['Response']['data']['inventoryItem']['tierTypeName']
		print "Item type is: " + item_tier + " " + item_type + "\n"
		

 

Here are a few ideas for the next steps:

  1. Add e-mail function, so the script mails me automatically when Xur lands.
  2. Create a web server, so we can access the app from a web page.
  3. Find Xurs location and add it to the output.
  4. Play some Trials of Osiris and get ready for the Rise Of Iron expansion

I’m hoping to expand on this program and add features as I go, leave me a comment and let me know if there’s anything you’d like to see, or if you fancy carrying me to the Lighthouse!

 

Connecting a Raspberry Pi to a WD MY Cloud Network Attached Hard Drive:

Recently, during a house move, I dropped my good old reliable Raspberry Pi and hard drive Network Attached Storage unit.
Basically, I had connected my external hard drive to a Raspberry Pi to make a Network Attached Storage drive, which allowed me to access my media from any device on my home network.

I then realised it was time to buy a dedicated NAS.

I shelled out for the WD My Cloud 4 TB server, which by all reports is a great piece of gear – however, I wanted something more than a standalone NAS and I wanted to be able to access the WD My Cloud from my Pi.

Here’s how I connected my Pi to the My Cloud! The guide below should work for any NAS – not just the WD My Cloud.

Find the IP address of your NAS:

If you don’t know the IP address of your NAS, you can perform an “arp-scan” from your Raspberry Pi to find it, here’s how I found mine:

# sudo arp-scan --localnet --interface=wlan0

If you still can’t find the IP address, for the WD My Cloud you can find the IP address in your settings, as per the instructions here.

In my case my NAS IP address was: 192.168.192.62

Mounting the NAS to the Raspberry Pi:

The first step to accessing the NAS from your Raspberry Pi is to mount the external HDD as a file system on the Raspberry Pi; this will allow you to view the NAS as you would any directory on the Pi. This is pretty easy actually, as the CIFS (Common Internet File System, a protocol dictating how different operating systems share files between them, including Windows and Linux) protocol takes care of everything.

First make a directory for the share:

# mkdir wdmycloud

Next mount the drive using the IP address and the Raspberry Pi directory you want to mount to:

# sudo mount -t cifs -o guest //192.168.192.62/Public /home/pi/wdmycloud

mount command

In this example, I am mounting the “Public” folder located on my NAS to the wdmycloud folder located on my Raspberry Pi.

The command syntax is:
mount -t <Mount Type> -o <Access> <SOURCE> <DESTINATION>

After executing the mount command, you should now be able to access the NAS file system as you would any other directory!

Automatically mount the NAS on power up:

 

Edit the FSTAB to mount your NAS automatically on power up:

To make the mount permanent, we need to add the NAS file system to the Raspberry Pi’s /etc/fstab file – the File System Table.

# sudo nano /etc/fstab

Edit the FSTAB file.

Add the NAS as a file system in the FSTAB file.

//192.168.192.62/Public /home/pi/wdmycloud cifs guest     0       0

Adding the NAS to the FSTAB.

You can see from the last line in the FSTAB file above, I have added the NAS as a file system in my FSTAB file.

This will automatically mount the NAS every time you power up your Raspberry Pi!

Testing the NAS is connected automatically on power up:

First step, reboot your Pi… :

# sudo reboot

Reboot your Pi.

 

Next step, check your NAS directory from the Pi:

# cd wdmycloud
# ls -larth

Confirm the share is working.

Success! The mount works – I can now access all of my pictures, music and movies from my Raspberry Pi. Everything is safely stored on my WD My Cloud, which keeps 2 copies of all of my data – so if anything goes wrong, I'll always have my data backed up 😉

 

Making a Twitter-Bot on your Galileo or Raspberry Pi

Twitter have developed an API (Application Programming Interface) for their website, which makes it really easy to send and receive Tweets from your Raspberry Pi or Galileo! The Twitter API takes all of the hard work out of writing a program to interface with Twitter. There are several ways to access the Twitter API, the easiest of which (in my opinion anyway 🙂) is to use the Twython package for Python.

Differences between Raspberry Pi and the Galileo:

It's really easy to install Twython on the Raspberry Pi and a little harder to install on the Galileo – so for that reason, I'll show the step-by-step instructions for the Galileo install. The only difference is that the Galileo doesn't require you to use the sudo command, as you already have root permissions set. For example, when editing a file with the Galileo you would use:

# nano my_file.txt

When editing the file on a Raspberry Pi you would use:

# sudo nano my_file.txt

Changing the date on the Galileo:

In order to install some of these packages, you’ll need to update the date and time of your Galileo – this isn’t automatically done when you connect to the internet as you may expect. If you don’t update your date and time, you’ll get SSL certification errors when you try to download the Twython package.

To change the date and time – use the following command in the format year-month-day hour:minute:second:

# date --set="2015-01-20 10:00:00"

Installing Twython (and other Python packages):

In order to connect our computer to Twitter, we’ll need to download some Python packages:

  1. setuptools – this will allow you to “Easily download, build, install, upgrade, and uninstall Python packages.”
  2. pip – this is a Python package installer.
  3. twython – this is the package which will actually interface with Twitter.

Here are the commands to install these packages – remember if you’re on a Raspberry Pi you’ll need to put “sudo” in front of each command.

# apt-get install python-setuptools

Here's what that looks like on your computer. When asked "do you want to continue" as per the picture below, enter "y" to continue.

Package 1 of 3.

Installing the setuptools Python package.

Next, install the “pip” package:

# easy_install pip

Finally, using pip – install the Twython package:

# pip install twython

 

Creating a Twitter App:

To create a Twitter App, you'll need to sign up to Twitter and register the account as an application. This is important, as you'll need to verify this App with Twitter every time you use it. Twitter uses OAuth to verify your App; this means you'll never have to supply your password to 3rd party App developers, but it does make it a little harder to verify your App. The good news is, Twython and other API wrappers handle all the OAuth pain – all you need to do is register your App and save the information. Register your app here: https://apps.twitter.com/ then click on "Create New App" and enter your information.

Let the fun begin!

Creating an App.

 

Changing App permissions:

As this is your App and you'll want to be able to play around with it, you can change the App permissions to allow you to read, write and access your direct messages. You can change these permissions at a later stage.


Change permissions.

Authorize your account:

You need to create your access token and access token secret before you use your App. Click on the “Keys and access tokens” tab. Click on the “Create my access token” button.


Generate your secret token.

You should now have all 4 pieces of required information:

  1. Consumer Key (API key).
  2. Consumer Secret (API Secret).
  3. Access Token
  4. Access Token Secret.

Now it’s time to write some Python!

Writing the Python:

Creating a Python file:

On your Raspberry Pi or Galileo, create a file called “Tweet.py” using the following command:

# nano Tweet.py

Now paste in the Python code:

Twitter Authorization:

In order to send a Tweet, you'll need to send Twitter your OAuth information. This process is handled by the Twython package. Here we are creating 4 string objects and a Twython object called "twitter". When we create the Twython object, we pass the 4 strings to it as arguments – these are the access keys you generated in the previous section.

CONSUMER_KEY = '<YOUR CONSUMER_KEY>'
CONSUMER_SECRET = '<YOUR CONSUMER_SECRET>'
ACCESS_KEY = '<YOUR ACCESS_KEY>'
ACCESS_SECRET = '<YOUR ACCESS_SECRET>'
twitter = Twython(CONSUMER_KEY,CONSUMER_SECRET,ACCESS_KEY,ACCESS_SECRET)

Reading in a command line argument:

To make the code a bit more flexible, we can pass the text we want to Tweet into the Python script as a command line argument. This is done by using the system “argv” parameter. In our case – we only want to take the first 140 characters of this text, as this is the character limit set by Twitter for each Tweet. We do that by using the command:

sys.argv[1][:140]
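The full script was embedded as a Gist in the original post; as a rough, self-contained sketch of what Tweet.py boils down to (using Twython's update_status call):

import sys
from twython import Twython

CONSUMER_KEY = '<YOUR CONSUMER_KEY>'
CONSUMER_SECRET = '<YOUR CONSUMER_SECRET>'
ACCESS_KEY = '<YOUR ACCESS_KEY>'
ACCESS_SECRET = '<YOUR ACCESS_SECRET>'

# Create the Twython object with our OAuth credentials:
twitter = Twython(CONSUMER_KEY, CONSUMER_SECRET, ACCESS_KEY, ACCESS_SECRET)

# Take the first command line argument, trimmed to Twitter's 140 character limit:
tweet_text = sys.argv[1][:140]

# Send the Tweet:
twitter.update_status(status=tweet_text)
print "Tweeted: " + tweet_text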

Executing the code:

You can execute the code from the command line like so:

# python Tweet.py "Hello world."

Here’s what that looks like on the Galileo:

First Tweet!

Hello World!

Then check your Twitter Bot!

Installing Debian on the Intel Galileo

Some of the guys in the Intel communities forum have done a great job of porting the Debian Linux distribution to the Intel Galileo. If you're not familiar with Debian, it is an open source Linux-based operating system. This is great news for the Galileo, as it makes it very easy to port any Raspberry Pi based projects directly to the Galileo!

Debian was ported to the Galileo by StuartAnderson, who posted on the Intel Communities page.

You can download the Linux image from here!

What you’ll need:

  1. Galileo.
  2. SD card.
  3. Serial cable.

Downloading the Galileo Debian disk image:

You can find the Galileo Debian image here. Click on the “SD card image” file. This will take some time as the file is 900MB. While the disk image is downloading, we can prepare the SD card.

Formatting your SD Card:

If your SD card has been used to write a Linux image before (used for a Raspberry Pi or Galileo Little Linux image), Windows can have trouble formatting the SD card. This is because Windows can only see the FAT32 partition and won’t see the much bigger Linux partition.

For this reason I used the SDformatter program from the SD Association, you can find it here (Click on Windows or Mac on the left).

You can see from the picture I'm using an 8GB SD card (7.4GB shown, but it's 8GB – it's normal for SD cards to be about 10% smaller than the advertised disk size); in my case Windows format only saw 56MB of the disk and ignored the remaining 7.35GB! However, with the SDFormatter program the full 7.4GB can be seen and will be formatted. Click on the Format button to start the process.


Using Rawrite32 to write the Galileo Debian disk image.

Writing the Debian Image to an SD card:

To write the Linux Debian image to the SD card you’ll need to download a disk image tool, as recommended by the original creators, I used the Rawrite32 Disk Image Tool software, found here.

After you’ve downloaded the Debian image, open Rawrite32 and select the Debian image to write.

Whoop - Debian!!!

Writing the Galileo Debian image.

You’ll see a pretty scary warning like this, but as long as you’re sure there’s no important data on the SD card that you need, go ahead and click “Yes”!


Click OK to start writing the Disk Image.

Once the Disk Image tool has finished writing – you’ll see something like this:


Finished writing the image.

You can see from the image above that the image file takes up 900MB, less than 1GB of my 7.4GB SD card. This means that the remaining 6.5GB of space on the SD card will not be available to me when I try to use the Linux image. In order to make the remaining 6.5GB of space available, we need to grow the size of our partition on the SD card.

Now that we have a Debian Linux system available – this can be done from the Galileo itself!

Expanding the filesystem:

To expand the filesystem, first we need to expand the partition, then expand the file system to fill that partition.

When formatting the SD card, we could see that the 8GB SD card had a size of 7.4GB, so I’m going to expand my partition to 7.4GB just to be on the safe side.

Disk size before partitioning:

Not enough room!

Before resizing the filesystem.

Expanding the partition:

To resize the partition, we are going to use a program called Parted. This can be used to create, resize, move and destroy disk partitions. The first command used is the resizepart command.

# parted /dev/mmcblk0 resizepart 2

Select "y" to continue and then enter the size of the partition you want to create; here's where I entered the 7.4GB (which works out as 7400MB).


You’ll need to reboot here!

# reboot

Expanding the filesystem:

When the partition has been resized we create a journal inode.

# tune2fs -j /dev/mmcblk0p2

You’ll need to reboot here!

# reboot

When the reboot has finished, finish by expanding the filesystem to the partition size.

# resize2fs /dev/mmcblk0p2

The resize2fs command will expand the filesystem to fill the partition; this will take about 10 minutes, but when it's finished you should see a confirmation message.

Disk size after expanding the filesystem:

Loads of room!

Disk size after expanding the partition.

From the above image you can see, we’ve grown the filesystem to fill the partition, so we now have a total disk size of 6.8GB, with 6GB available to play with!

Now that the filesystem is expanded, it’s a good idea to update the system.

apt-get update && apt-get dist-upgrade

The update and upgrade will take about 10 minutes, then you can have fun playing with your new Debian computer.

Creating a Web Server on the Galileo: Using the Arduino IDE.

Creating a Web Server using the Arduino IDE example:

The Arduino IDE comes with a prebuilt Web Server example. If you’re not sure what a Web Server is, or what it’s used for – check out my blog post here -> Creating a Web Server on the Galileo

It's pretty sweet actually.

Arduino Example for creating a Web Server.

The above example comes with enough code to:

  • Create a Web Server on the given IP address.
  • Create a blank HTML page.
  • Read the analogue inputs and display their values on the HTML page.

How the Arduino Web Server works:

The Web Server uses the Arduino Ethernet library to answer any HTTP requests made to the Galileo. The Galileo already has an Ethernet connection on board and fully supports the Ethernet library.

Here’s how the library is explained on the Arduino website:

The library allows the Arduino device to connect to the internet. It can serve as either a server accepting incoming connections or a client making outgoing ones. The library supports up to four concurrent connections (incoming or outgoing or a combination).

The Arduino example does not conform to the full HTML page structure, and instead relies on printing text between the opening and closing HTML tags.
This is not the ideal solution for creating a fully functioning HTML page, but it is a very quick and easy way to control your Galileo via a web browser. The example is pretty bare-bones: a HTML-only website with no CSS styling or server-side scripting functionality. Without any JavaScript or PHP, your page needs to be refreshed any time the data changes.

Creating the Web Server in the Arduino IDE:

We start the Web Server by initialising the Ethernet class; to do this, we need to call the Ethernet.begin() function and pass the required MAC and IP addresses to it.

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
IPAddress ip(192,168,1,177);

void setup() {
//  Ethernet.begin(0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED, 192,168,1,177); 
    Ethernet.begin(mac, ip);
}

Once the Ethernet class has been initialised, we need to tell the server to listen for incoming connections on port 80; this is the default port for HTTP requests. You can change this port number, which I will be doing in the Node.js example.

EthernetServer server(80);

void setup() {
  server.begin();
}

When the server is running and a client is available (connected via a web browser), the client.println() function is used to send data to all the connected devices. HTML can be printed directly using the client.println() function.

void loop() {
  // "client" is assumed to come from server.available() earlier in loop():
  client.println("<title> Allyns webserver: </title>");
}

The above code can be used to build a fully functioning Web Server.

Note:

At the time of writing there is a bug in the Arduino Ethernet library, which is discussed and a fix is suggested here.
