Tuesday, May 15, 2018

Final Project Code

#!/usr/bin/env python
# Display a runtext with double-buffering.
from samplebase import SampleBase
from rgbmatrix import graphics
import time
import urllib  # Python 2 urllib; on Python 3 this would be urllib.request
from operator import itemgetter
from google.transit import gtfs_realtime_pb2

class RunText(SampleBase):
    def __init__(self, *args, **kwargs):
        super(RunText, self).__init__(*args, **kwargs)

    def run(self):
        global a_data
        global f_data
        state = False
        offscreen_canvas = self.matrix.CreateFrameCanvas()
        font = graphics.Font()
        font.LoadFont("../../../fonts/5x8.bdf")
        pos = offscreen_canvas.width

        time_since_refresh = 0
        time_since_state = 0
        while True:
            if state:
                master_data = a_data
                color = graphics.Color(0, 57, 166)   # A line blue
            else:
                master_data = f_data
                color = graphics.Color(100, 39, 10)  # F line orange
            sorted_arrivals = sorted(master_data, key=itemgetter('arrival_time'))

            offscreen_canvas = self.matrix.SwapOnVSync(offscreen_canvas)
            offscreen_canvas.Clear()

            # Current UNIX time
            current_time = time.time()

            i = 0
            count = 0
            # show up to four upcoming arrivals (guard against short feeds)
            while count < 4 and i < len(sorted_arrivals):
                arrival = sorted_arrivals[i]
                route_id = arrival['route_id']
                arrival_time = arrival['arrival_time']
                direction = arrival['direction']
                minutes = int((arrival_time-current_time)/60)
                i += 1

                # ignore if train is at station
                if minutes <= 0:
                    continue

                graphics.DrawText(offscreen_canvas, font, 1, 7+(8*count), color, route_id)
                graphics.DrawText(offscreen_canvas, font, 7, 7+(8*count), graphics.Color(255,255,255), direction)
                offset = 11
                if minutes >= 10:
                    offset += 5
                graphics.DrawText(offscreen_canvas, font, offscreen_canvas.width - offset, 7+(8*count), graphics.Color(127,255,0), str(minutes) + "m")
                count += 1

            if time_since_refresh > 800:
                f_data = getArrivals(station_codes, 21)
                a_data = getArrivals(station_codes, 26)
                time_since_refresh = 0

            if time_since_state > 8:
                state = not state
                time_since_state = 0

            time_since_refresh += 0.5
            time_since_state += 0.5
            time.sleep(0.5)

def getArrivals(station_codes, feed_id):
    feed = gtfs_realtime_pb2.FeedMessage()
    # fetch the real-time feed for the given feed_id (26 = A/C/E, 21 = B/D/F/M)
    response = urllib.urlopen('http://datamine.mta.info/mta_esi.php?key=84167bd227ea2855b51d24072b701861&feed_id=' + str(feed_id))
    feed.ParseFromString(response.read())

    arrivals = []
    for entity in feed.entity:
        if entity.HasField('trip_update'):
            for stop_time_update in entity.trip_update.stop_time_update:
                if stop_time_update.HasField('stop_id') and stop_time_update.stop_id in station_codes:
                    route_id = entity.trip_update.trip.route_id
                    arrival_time = stop_time_update.arrival.time
                    direction = stop_time_update.stop_id[-1:]
                    if direction == 'N':
                        direction = 'Uptown'
                    if direction == 'S':
                        direction = 'Downtown'
                    arrivals.append({'route_id': route_id, 'arrival_time': arrival_time, 'direction': direction})
    return arrivals


# Main function
if __name__ == "__main__":

    # W 4 Station Codes
    # http://mtaapi.herokuapp.com/stations
    station_codes = ["A32N","A32S","D20S","D20N"]

    f_data = getArrivals(station_codes, 21)
    a_data = getArrivals(station_codes, 26)

    run_text = RunText()
    if (not run_text.process()):
        run_text.print_help()

Saturday, May 12, 2018

Final Project Prototype

The final project really came together when assembling the last exterior components. I acquired a high-quality white acrylic board and a thin diffuser sheet. The acrylic board was used to form the solid exterior casing around the electronic components, and the thin sheet was placed over the LED matrix to diffuse the light in an aesthetically pleasing way.

I first created the box layout in Illustrator by measuring the dimensions of the LED matrix and adding the appropriate margins to fit the electronic components. I also added holes for the matrix opening and the power cord.

A case generator was used to create the edge mounting pattern on the initial case, with further edits made in Adobe Illustrator

I tested the above layout by laser cutting the vector on cardboard to check the dimensions and ensure that the settings would be correct for the final cut. I then used the white acrylic and dialed the settings up to the recommended values to cut through the acrylic. The first laser cut came out perfectly, so luckily there was no trial-and-error process there.

The thin sheet was mounted to the LED matrix using four spaced-out dots of hot glue. The dots were placed strategically to stretch the material as evenly as possible without crumpling it. I then assembled the front and sides of the acrylic casing using a lining of hot glue. I placed the LED matrix in the center of the assembly and mounted it using stiff cardboard spacers. The spacers were calibrated so that the matrix would sit in the exact center of the acrylic assembly, and hot glue was generously applied to them to finalize their placement.

In order to increase the durability of the external parts, I hot glued the thicker part of the power cable to the power cable hole, ensuring that it would exit the assembly at an exact perpendicular angle.

The videos below show the product in action, using A and F line data from the West 4th Station.





Tuesday, May 8, 2018

Final Project API Integration

Replacing the static data with real-time MTA subway data was a task that required more work than previously estimated.

The first step in the process was registering for the MTA real-time data API. Once access was granted, the API key was used to make GET requests to the real-time endpoints. Instead of providing the data in an easy-to-read JSON format, the API serves it in GTFS-realtime format (an extension of the General Transit Feed Specification), the Protocol Buffers-based format that public transit agencies use to publish real-time trip updates.

In order to read the GTFS data, I found a GTFS-realtime parser library to convert the feed into a readable Python object. Once the data was converted, I printed the entire object to the console so I could analyze it further. Closer inspection revealed a huge set of inconveniently nested data. I used for loops and validation checks to iterate through each level of the object and drop unneeded data. Because I wanted the train sign to display data for the W 4 subway station, I first eliminated all updates that did not concern routes traveling through that station. Once I had the filtered data set, I sorted the W 4 arrivals by the difference between each UNIX arrival timestamp and the current time, which let me view the arrivals from closest to furthest. I also used substring logic to determine whether a trip was going Uptown or Downtown by checking for the "N" or "S" suffix on the stop ID.
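Condensed, the filtering, direction, and sorting logic looks roughly like this (the complete listing is in the May 15 post above); it assumes the feed has already been parsed into a gtfs_realtime_pb2.FeedMessage:

from operator import itemgetter

def upcoming_arrivals(feed, station_codes):
    arrivals = []
    for entity in feed.entity:
        if not entity.HasField('trip_update'):
            continue  # skip entities that are not trip updates
        for stu in entity.trip_update.stop_time_update:
            if stu.HasField('stop_id') and stu.stop_id in station_codes:
                arrivals.append({
                    'route_id': entity.trip_update.trip.route_id,
                    'arrival_time': stu.arrival.time,  # UNIX timestamp
                    # the stop ID suffix encodes the direction
                    'direction': 'Uptown' if stu.stop_id.endswith('N') else 'Downtown',
                })
    # closest arrival first
    return sorted(arrivals, key=itemgetter('arrival_time'))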

I created a new list of dictionaries from the sorted data to pass to the display function. UNIX time differences were divided by 60 to find the minutes until arrival, and I dropped any arrival that rounded down to 0 minutes in order to hide trains already stopped at the station. I also created independent timers on the main thread so that the display refreshes on a short interval while new API data is pulled on a much longer one, for performance reasons; the timer skeleton is sketched below. Although the future estimates in the feed allow cached data to be reused, I still wanted to refresh the feed every several minutes to account for unknown delays and route changes.
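Condensed from the full listing in the May 15 post, the timer skeleton works like this: the loop ticks every half second, the display state flips every 8 seconds, and the feeds are re-fetched roughly every 13 minutes (800 seconds of accumulated ticks):

import time

time_since_refresh = 0
time_since_state = 0
state = False  # False = F-line data, True = A-line data

while True:
    # ... draw the arrivals for the currently selected line here ...
    if time_since_refresh > 800:
        # ... pull fresh A and F feeds from the API ...
        time_since_refresh = 0
    if time_since_state > 8:
        state = not state  # switch between A-line and F-line data
        time_since_state = 0
    time_since_refresh += 0.5
    time_since_state += 0.5
    time.sleep(0.5)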

Finally, I implemented both A and F lines by pulling data from both feeds and setting up another independent timer to switch the display between displaying A lines and F lines.

Saturday, April 28, 2018

Final Project Alpha

Connection to the Raspberry Pi was successfully established using the WiFi dongle setup. I wanted to avoid using Ethernet or an external monitor, so I researched different ways to set the Pi up remotely.

I eventually decided to burn Raspbian Stretch onto the SD card and have the OS connect to WiFi automatically on boot. I edited some configuration files in the OS so that it would automatically join my iPhone's hotspot using the WiFi dongle. Once booted, I was able to verify the connection by seeing that the iPhone hotspot bar showed one connected device.

In order to connect to the Pi over SSH, I had to determine the IP address it was using on the WiFi network. To do this, I connected to the hotspot with my laptop and used the "arp -a" command to view the other local IP addresses. I found the correct address and established an SSH connection from my terminal to the Raspberry Pi. Immediately after logging in for the first time, I set up SSH keys so that I wouldn't have to type my password every time I connected. I also updated all of the Linux packages so that the Pi would have the latest libraries and tools I needed.

Connecting to the Raspberry Pi eventually worked via SSH

In order to test the Raspberry Pi, I made a simple Python program that turns an LED on and off every few seconds, along the lines of the sketch below. This test was a success, so I decided to wire up the LED matrix.
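The test program was along these lines; a minimal sketch that assumes the LED (with a series resistor) is wired to GPIO 18, since the exact pin was not recorded:

import time
import RPi.GPIO as GPIO

LED_PIN = 18  # assumption; use whichever GPIO pin the LED is actually wired to

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)

try:
    while True:
        GPIO.output(LED_PIN, GPIO.HIGH)  # LED on
        time.sleep(2)
        GPIO.output(LED_PIN, GPIO.LOW)   # LED off
        time.sleep(2)
finally:
    GPIO.cleanup()  # release the pin on exit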


Before wiring up the LED matrix completely, I tested it to confirm that it lit up and to check for any defects in the hardware. Some pixels were not as bright as the others, but I determined that there were no major defects that would prevent me from proceeding with the setup.

After wiring the LED matrix, I realized that it would be much better to configure and test it with the Adafruit RGB Matrix HAT + RTC for Raspberry Pi. This ended up being a great purchase because it provides the correct pinout for the RGB matrix library I needed to interface with the display. It also passes power through to the Raspberry Pi, which allowed me to attach a single 4 A supply to the HAT and power both the matrix and the Pi at the same time, eliminating the need for multiple power connections. I did not end up using the real-time clock that comes with the HAT.

Setting up the Adafruit RGB Matrix HAT + RTC required soldering about 50 different connections on both sides of the HAT
Once the HAT was connected, I was able to test the LED matrix. I used the library's text functions to draw text on the display. I soon realized that the display's resolution would only allow the train times to be shown clearly; other data would have to be left out, since it would add too much clutter to the screen.


Using sample static data, I was able to start adding text to the display. I fine-tuned the text by trying different font sizes so that everything would fit on the screen. I also edited the font file itself to reduce the spacing between letters and the width of the space character, so that an entire train time could fit across the width of the display. After figuring that out, I added colors that I thought would give the display an aesthetic quality.




Thursday, April 12, 2018

Final Project Update

So far I have not encountered anything I expect to have difficulty with in setting up the Raspberry Pi. The only obstacle is obtaining all the equipment needed to properly deploy the system. For now I have been using an HDMI cable to connect to a display here at MAGNET. I will later transition to using VNC Viewer to connect to the Raspberry Pi over the Ethernet port, because locating a display every time I want to work on the project is far too time consuming. I still need to obtain an Ethernet-to-USB adapter so that I can properly connect my laptop to the Pi, and download and install the OS image to the SD card. I will also obtain the WiFi component for the Raspberry Pi so that it does not have to stay permanently connected to the router. Finally, I need to obtain a micro USB 2 A charging cable, because right now I am using a borrowed micro USB cable from Elton. In regards to screens, I have been considering a TFT screen or a higher-fidelity LCD screen, and will send some examples for your review.

Items to Acquire ranked by Priority
- Ethernet to USB Adapter cable (P0)
- Micro USB 2 A charging cable (P1)
- Raspberry Pi WiFi component (P1)
- TFT screen (exact model unknown) (P2)

I should have the wiring for a basic LED setup done by class today, and will expand on it once I acquire the materials above.

Milestones:

- Week 11 - Install firmware, order required components shown above, Basic LED Wiring setup with Raspberry Pi
- Week 12 - Alpha version will connect to WiFi and collect data from basic API
- Week 13 - Deploy Node.js server API and connect it to the Raspberry Pi's API call
- Week 14 - Complete outputs for the server API by attaching LEDs and setting up the screen
- Week 15 - Complete outer casing and design components

Thursday, April 5, 2018

Interactive Objects Systems Final Project Pitch

https://drive.google.com/file/d/1YIYDg2ZUSFrSHNBIH7UMymhAHKXprdDK/view?usp=sharing

Weather Visualization API Project

This project took a lot of time to formulate and to make, but produced a product that is definitely very cool to use. When starting the project, I originally wanted to use the WiFi shield to collect data from an API. I started soldering one and then later realized that I had received the wrong shield, so I acquired the correct one.

Adafruit WINC1500 WiFi Shield

Soldering the WiFi shield involved attaching five sets of header pins to the outer pin holes, and one set upside down on the right-most section shown in the image above.


Using the soldering iron to properly attach the header pins at the correct angles and positions

We then set to work brainstorming possible applications of the WiFi Shield. The idea we decided to execute was one that utilized the weather API to collect weather data and display it in a creative way.

Our initial sketch of the finished product

We decided to use the LCD panel to display the location name and temperature, and two LEDs to indicate whether the two attributes shown on the bottom row of the panel (e.g., Cloudy and Humid) were true or false. A lit LED indicates true; an unlit LED indicates false.

To add more dynamic behavior to our project, we decided to use two states to display the data. One state would show the temperature in Fahrenheit, and the other would show the temperature in Celsius. The two attributes on the bottom of the LCD would also be different. The program alternates between the two states every 4 seconds to allow for equal viewing of each set of data.

The next step was programming the board to make a GET request to the API of our choice: OpenWeatherMap. We created an account on OpenWeatherMap to get an API key for authenticating the request. The response includes a JSON body containing the data we needed. We initially ran into a problem setting this up, because our board did not have enough memory to support all of the libraries that were recommended for the task.
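For reference, the request looks roughly like this, shown in Python for brevity (the project makes the equivalent call from the Arduino WiFi shield); the query parameters and placeholder key follow OpenWeatherMap's standard current-weather endpoint:

import json
import urllib  # Python 2 style, matching the train-sign code; use urllib.request on Python 3

API_KEY = 'YOUR_API_KEY'  # placeholder
url = ('http://api.openweathermap.org/data/2.5/weather'
       '?q=New%20York,us&units=imperial&appid=' + API_KEY)

weather = json.loads(urllib.urlopen(url).read())
print('Temp: %s F, humidity: %s%%' % (weather['main']['temp'], weather['main']['humidity']))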

We were kindly provided a Metro Express board, but we had trouble getting the driver to connect to it. Some research we did online suggested that our USB cable might have been a charge-only cable rather than a data (sync) cable, which would explain why the serial connection to the board could not be established. Another option would have been to use multiple Arduino/Genuino Uno boards, but we ultimately decided on a different solution to the problem.

Most implementations of the API recommend using a library called ArduinoJson to parse the JSON directly from the response. However, because we did not have enough memory, we omitted the ArduinoJson library from our implementation. We replaced it with our own parsing routine, which splits the response stream on quotation-mark characters and uses an indexing system to extract the desired data field. Once the indexed field is extracted, we take a substring from it to remove the miscellaneous characters that belong to the JSON structure. With those characters removed, we then cast the String to the desired type, such as double or int; Arduino provides helpful functions such as toInt() for this.
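To illustrate, here is the same idea sketched in Python (the actual implementation uses Arduino String functions such as substring() and toInt()); the field index below is hypothetical and depends on the exact layout of the response:

# Splitting the raw response on '"' turns keys and values into list entries,
# so the position of a field is fixed for a given response layout and can be
# hard-coded after inspecting one sample.
raw = '{"main":{"temp":63.5,"humidity":71},"name":"New York"}'

tokens = raw.split('"')
humidity_field = tokens[6]                    # ':71},' -- the piece right after the "humidity" key
humidity = int(humidity_field.strip(':,}'))   # strip leftover JSON punctuation, then cast
print(humidity)                               # 71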

Sample Extracted Data

Now that the data was being retrieved properly from the API, it was time to wire up the components. We used the existing wiring for the LCD and attached the two LEDs that indicate whether the attributes on the bottom of the LCD are true or false.

Establishing the proper wiring for the board


While we were fixing some initial problems in the wiring, we were also working on the thresholds at which each data value would trigger a true or false output. We took the average of each value in the extracted data and set thresholds so that a reading above the average counts as true. For example, humidity above 63% lights the LED for that attribute.

We ran into a problem using UNIX time to compare the sunrise and sunset data, because Arduino does not have a good way of establishing the current time without an RTC component. It was also surprising to discover how few tools in the standard libraries deal with UNIX time. We later replaced the sunrise and sunset attribute with Rain.

After fixing the wiring and other issues, we were able to produce a working prototype:


A view of State 1
A view of State 2







Thursday, March 22, 2018

Fortune Teller LCD with Shift Register

Ideation

My original idea was to use the flex sensor: bend it in one direction and negative fortunes appear; bend it in the other direction and positive fortunes appear.

I chose this idea because I thought it would be interesting to give the user the power to influence the quality of their fortune, which is not something you typically get to do in fortune telling.

Positive fortunes:

You will reconnect with an old friend.
You discover something new today.
You will get a good grade on an assignment.


Negative fortunes:

You will forget an important task today.
Your next train will be delayed.
You will get a bad grade on an assignment.

Execution

The first step was to wire the LCD board to the Arduino using the shift register.



I followed the above Fritzing diagram to achieve the wiring. Unfortunately, after doing this I discovered that there was a problem connecting to the board. After some trial and error, I determined that it was caused by a short somewhere in my setup. Because of this problem, I could not properly set up the board with the shift register.

Here is an image of the wiring for the LCD without the flex sensor.




LCD wiring including the flex sensor.

I was not able to fully realize the idea I had. I did, however, take the time to lay out the foundations of the code that would power the fortune teller.

I installed the Adafruit_LiquidCrystal library in Arduino. Using the flex sensor code that I had developed for a previous assignment as a reference, it was relatively easy to incorporate the detection of different states into the program.

https://github.com/Fuzzlr/WebDev_S18/blob/master/LCD/LCD.ino

The code above uses arrays to store the fortunes and uses the flex sensor reading to determine whether a positive or negative message is displayed.
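For illustration, here is a minimal sketch of that selection logic in Python (the linked code is the actual Arduino version); the threshold values are assumptions that would need calibration against real flex-sensor readings, and the random choice is just one way to pick from each array:

import random

positive = ["You will reconnect with an old friend.",
            "You discover something new today.",
            "You will get a good grade on an assignment."]
negative = ["You will forget an important task today.",
            "Your next train will be delayed.",
            "You will get a bad grade on an assignment."]

BEND_POSITIVE = 600  # assumed reading when the sensor is bent one way
BEND_NEGATIVE = 400  # assumed reading when it is bent the other way

def pick_fortune(sensor_value):
    if sensor_value > BEND_POSITIVE:
        return random.choice(positive)
    if sensor_value < BEND_NEGATIVE:
        return random.choice(negative)
    return "Bend the sensor to reveal your fortune."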

UPDATE:

I was able to properly wire the LCD and produce the desired results with updated code.





Link to updated code:



Thursday, March 1, 2018

Class Glove Exercise Rewrite

I have completed the rewrite of the code with more sensitive, duration-based detection of gestures instead of discrete states. By saving the previous sensor value, I can detect whether an opening or closing gesture has been occurring for over 100 milliseconds; this window can be changed depending on how the program is calibrated. The code is commented to show each step of the implementation, and a rough sketch of the approach is shown below.
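Here is a rough sketch of that duration-based detection in Python (the actual rewrite, linked below, is Arduino code); the sample readings and sampling interval are made up for illustration:

GESTURE_MS = 100  # the gesture must persist at least this long to count

def detect_opening(samples, sample_interval_ms=10):
    # Yields True whenever the readings have been rising for more than
    # GESTURE_MS; `samples` is any iterable of flex-sensor readings.
    previous = None
    rising_ms = 0
    for value in samples:
        if previous is not None and value > previous:
            rising_ms += sample_interval_ms  # still opening
        else:
            rising_ms = 0                    # movement stopped or reversed
        yield rising_ms > GESTURE_MS
        previous = value

# toy run: a steady rise that eventually registers as an "open" gesture
readings = [200, 205, 211, 218, 226, 235, 245, 256, 268, 281, 295, 310, 326]
for i, detected in enumerate(detect_opening(readings)):
    if detected:
        print("open gesture detected at sample %d" % i)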

Another way this could have been achieved would be with different boolean flags and use of the millis() function.

Alternatively, the program could have also been made in a way that sets thresholds on the sensor value instead of time.

https://pastebin.com/Uk4MMP7k