Really Awesome
A Project by Just A Team!
Pujun Lun, Ruoxuan Xu, Xiaoxing Yan
Music beat-matching games are gaining popularity these days. Apps like Tapsonic and BeatMania synchronize the music beat with moving colorful objects on screen and interact with players' touches during the game.
The method these games use to visualize beats and notes is a combination of programs and manual refinement, because automatically detected beats, in most cases, do not perfectly accord with human perception. That is why our program generates the music map files both manually and programmatically. Several rhythm-analysis libraries are available in Python to make use of.
Our project can be divided into two parts: hardware design and software design. The following sections describe the steps of each part, along with the challenges we met.
Hardware Design
Figure: Four Keys
--Challenge
In this part, we need the strike of our palms to be detected by the hardware and the signal transmitted to the RPi via GPIO, so that strikes can be recorded while playing music according to the rhythm.
--Problem
At first, we planned to use a piezoresistive film sensor to record the strike, which changes its resistance once pressed. However, the problem with this material is that its response time is almost 0.5 s, which is not feasible for real-time input. After we found this problem, we started looking for other materials that meet the real-time requirement.
The first solution we found was the Makey Makey device, which can serve as an input to the Raspberry Pi. But soon we realized that this device works better with a computer than with an RPi. Besides, according to the tutorial on its website, the device may need to be initialized in order to work with the RPi.
Then we tried to use the LM358 amplifier to detect the very fast pressure-change events in the sensor films; such a circuit has already been published on the web. Yet, after studying the circuit, we found that it tracks peak voltage rather than detecting very fast pressure changes. Finally, we were inspired by the idea of the membrane key, whose circuit can be closed by a piece of conductive material.
--Solution
We implemented four keys using the basic circuit shown below, leaving the circuit open at the position of switch 1. At the same time, we placed the foam obliquely and fixed aluminum foil underneath it. When a player taps the key, the aluminum foil touches the wire and the circuit is closed.
The picture on the left shows what the open circuit looks like. We bent the wire into a spiral to increase the effective contact area.
--Testing
Testing the hardware is simple: the circuit is closed whenever there is an input on a GPIO pin. Therefore, our team wrote a simple Python file that prints a message whenever the corresponding GPIO pin receives a low-level signal.
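A minimal version of such a test script might look like the sketch below. The pin numbers match our wiring; the stub class is an assumption added so the sketch can also run on a machine without the RPi.GPIO library, and is not part of the real setup.

```python
try:
    import RPi.GPIO as GPIO          # the real library, available on the Pi
except ImportError:
    class _StubGPIO(object):
        """Minimal stand-in so the sketch can run off the Pi (assumption)."""
        BCM = IN = PUD_UP = None
        def setmode(self, mode): pass
        def setup(self, pin, direction, pull_up_down=None): pass
        def input(self, pin): return 1   # pulled up = not pressed
        def cleanup(self): pass
    GPIO = _StubGPIO()

ALL_PINS = [5, 6, 13, 26]  # BCM pin numbers from our wiring

def poll_keys(gpio=GPIO, pins=ALL_PINS):
    """Return the pins currently reading low, i.e. the pressed keys."""
    return [pin for pin in pins if not gpio.input(pin)]

if __name__ == "__main__":
    GPIO.setmode(GPIO.BCM)
    for pin in ALL_PINS:
        GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    for pin in poll_keys():
        print("Key on GPIO {} pressed".format(pin))
```

On the Pi, tapping a key pulls the pin low and the script prints the corresponding GPIO number.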
Software Design
--Pygame running on the piTFT to display the user interface.
The main page we designed is a list of different pieces of music, each labeled with the difficulty of catching its beat. Clicking the 'quit' button on the screen exits the program and returns to the Raspberry Pi console, whereas clicking one of the music names starts the game, with the music played from a specified point in time that matches the moment when the sliding page has been initialized and colorful blocks begin hitting the bottom of the screen.
After the program starts, we first see 4 rails on the screen with a horizontal pink band at the bottom, which we refer to as the 'Verification Zone'. In every frame (1/60 s), a fixed length of data is read from the map file, according to which a colorful block of a certain length is generated on one of the four rails at the top of the screen.
In the next frame, if the following digits in the map file imply the continuity of the previous block, the block grows longer; if not, the block keeps its previous size and simply slides down by a specified distance. The piTFT detects touches on the screen; if there is one, the program quits and returns to the RPi console. Meanwhile, the GPIO pins collect key-tapping information.
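Each frame of the map file is stored as one integer whose low four bits encode the four rails (the same `frame >> 3 - i & 0x01` test appears in visualize.py). A small sketch of the decoding:

```python
def decode_frame(frame):
    """Split one map-file integer into four per-rail flags.

    Bit 3 is the leftmost rail and bit 0 the rightmost, matching
    dict_pin = {5: 8, 6: 4, 13: 2, 26: 1} in manual_make.py.
    """
    return [(frame >> (3 - i)) & 0x01 for i in range(4)]

# A frame value of 10 = 0b1010 means rails 0 and 2 have active blocks.
print(decode_frame(10))   # -> [1, 0, 1, 0]
```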
When a sliding block is passing through the Verification Zone and a signal from GPIO is detected, the corresponding area of the Verification Zone turns the same color as that sliding block, indicating that the user is tapping the keys at the correct time.
To implement the virtual buttons on the starting page, we used what we learned in the labs: the pygame.font.render and screen.blit functions in display.py set up the text surfaces at designated positions and display them on screen. We also modified the mouse-click detection code so that clicks on the piTFT are detected, and based on the coordinates of the click, the program continues or quits.
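Since the menu entries are stacked vertically, the hit test itself only needs the Y coordinate of the click. The sketch below mirrors the 60-pixel bands used in main.py; the entry names are the ones from our music list:

```python
# Vertical bands of the menu, top to bottom, on the 240-pixel-high screen.
MENU = [(0, 60, "quit"),
        (60, 120, "Peace & Love"),
        (120, 180, "Locked Away"),
        (180, 240, "July")]

def hit_test(pos):
    """Map an (x, y) click position to the menu entry it falls in."""
    x, y = pos
    for top, bottom, name in MENU:
        if top <= y < bottom:
            return name
    return None  # click outside the menu area

print(hit_test((160, 90)))   # -> Peace & Love
```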
Displaying the four colorful blocks sliding down the rails is a little more complicated, as the blocks are trapezoids that change position and shape every frame. When a block hits the top stripe of the verification zone, its top edge keeps moving down at the same speed, while its bottom edge is stopped by the verification zone, and the covered area turns the color of the passing trapezoid.
In visualize.py, we first set the widths and positions of the rails, which begin at the very top of the screen and extend to the edge of the verification zone, identifying each block with the variable 'index'. Then we set fixed slopes for the left and right boundaries of each rail, and a constant vertical speed (2 pixels/frame) at which the trapezoids' Y coordinates change. We designed the function _get_points() to refresh the four vertices of a trapezoid every frame, based on the constant increment of their Y coordinates and the specified slopes. The only exception happens when a computed Y coordinate falls below the top of the pink band or off the bottom of the screen, in which case that Y coordinate stays clamped.
Finally, the trapezoid is rendered by screen.render_polygon(), with a different color according to the value of index. To make the trapezoid actually move, we do what we did in the lab: we display the image and hold still for 1/60 s, then clear the screen (filling it with black) and draw new graphics in the next 1/60 s.
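Stripped of the rendering, the per-frame vertex update is plain geometry. The sketch below assumes the 320x240 piTFT resolution and 40-pixel verification zone from display.py and visualize.py; the helper names are ours, not the actual methods:

```python
WIDTH, HEIGHT = 320, 240      # piTFT resolution, as in display.py
VERIFY_HEIGHT = 40            # height of the pink verification zone

def trapezoid_points(top_y, bottom_y, x_top_left, x_top_right,
                     left_slope, right_slope):
    """Four vertices of a rail-aligned trapezoid.

    Each edge's X offset is its Y coordinate divided by the rail slope
    (slope = dy/dx), so blocks widen as they slide toward the bottom.
    """
    def x_at(y, x_anchor, slope):
        return x_anchor + (y / slope if slope else 0)
    return [(x_at(top_y, x_top_left, left_slope), top_y),
            (x_at(top_y, x_top_right, right_slope), top_y),
            (x_at(bottom_y, x_top_right, right_slope), bottom_y),
            (x_at(bottom_y, x_top_left, left_slope), bottom_y)]

def move_down(top_y, bottom_y, speed=2):
    """Advance one frame; the bottom edge stops at the verification zone."""
    return (min(HEIGHT, top_y + speed),
            min(HEIGHT - VERIFY_HEIGHT, bottom_y + speed))
```

The clamping in move_down is what makes a block appear to "pile up" on the verification zone while its top edge keeps sliding.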
Another aspect of the display is the color change of the verification zone. In keyboard.py, we define the function key_status() to return the list of GPIO inputs to the main loop in visualize.py. The return value indicates which of the 4 keys are currently pressed. This 'pressed' value is then used in the move_all_trapezoids() function in visualize.py: only when it is true, and there is a trapezoid whose bottom has reached the edge of the verification zone, does the corresponding zone area change color.
--Detecting beats and notes and turning them into the appearance of sliding blocks.
The latest version of our code uses a library called Essentia to detect beats. It can analyze any music file and output a numpy array indicating the points in time at which beats occur. The major defect of this library is that loading it into main memory costs 15 seconds. We don't want our users to wait that long every time they enter the application, so we choose to analyze every music file only once, generate a map file, and reuse the map file in the future.
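The analyze-once-then-reuse pattern is essentially the cache-or-compute idiom analyze.py applies with .npz files. A distilled sketch, with a placeholder analyze function standing in for the expensive Essentia pass:

```python
import os
import numpy as np

def get_onsets(music_path, analyze, cache_dir="."):
    """Return cached onset times for music_path, computing them only once.

    `analyze` is a stand-in for the Essentia pass (an assumption here);
    it should return a 1-D sequence of beat times in seconds.
    """
    stem = os.path.splitext(os.path.basename(music_path))[0]
    cache = os.path.join(cache_dir, stem + ".npz")
    if os.path.isfile(cache):
        return np.load(cache)["onsets"]        # fast path: reuse the map file
    onsets = np.asarray(analyze(music_path))   # slow path: run detection once
    np.savez(os.path.join(cache_dir, stem), onsets=onsets)
    return onsets
```

The first call per song pays the full analysis cost; every later call only pays for a small np.load.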
In our test, analyzing the rhythm of a 3-minute piece of music takes about 17 seconds. To reduce this, we split the data read from the music file into four parts and use the multiprocessing module of Python to let four CPU cores analyze the data in parallel. After this modification, the analysis costs only 5 seconds.
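The splitting itself is generic. A sketch of the idea, with a trivial stand-in for the per-chunk onset detector (note that each chunk's local results must be shifted by the chunk's start offset before merging, just as our code offsets by the segment length):

```python
import multiprocessing

def chunk_beats(samples):
    """Stand-in for per-chunk onset detection (assumption, not Essentia).

    Returns 'beat' positions local to the chunk: here, simply the indices
    whose sample value exceeds a threshold. Purely illustrative.
    """
    return [i for i, s in enumerate(samples) if s > 0.9]

def parallel_beats(audio, num_process=4, use_pool=True):
    """Split audio into equal chunks, detect beats per chunk, then shift
    each chunk's local positions back to global ones before merging."""
    seg = len(audio) // num_process
    chunks = [audio[i * seg: (i + 1) * seg] for i in range(num_process)]
    if use_pool:
        pool = multiprocessing.Pool(num_process)
        try:
            results = pool.map(chunk_beats, chunks)
        finally:
            pool.close()
            pool.join()
    else:
        results = list(map(chunk_beats, chunks))  # serial fallback
    beats = []
    for i, local in enumerate(results):
        beats += [p + i * seg for p in local]     # offset by chunk start
    return beats
```

With a real detector returning times in seconds, the offset would be `i * seg / sample_rate` instead of `i * seg`.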
Since the output is just the points in time where beats occur, we need another trick to make sure that the blocks indicating the rhythm are long enough to be caught by the user. When we are going to play a piece of music, we read the map file and convert it into an array of time points. Then in each frame, we determine whether a block should appear. When it should, we set the state of a certain rail to the designated block length, say 30 pixels. In each following frame, the state is decremented until it reaches zero; as long as the state is nonzero, the trapezoid keeps gaining length, so its length eventually reaches the designated value.
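A simplified version of this per-rail countdown, using the 30-pixel target length mentioned above and a decay of 2 pixels (the trapezoid speed) per frame:

```python
def update_state(state, beat_now, target=30, speed=2):
    """One rail's per-frame state update.

    A new beat reloads the countdown; otherwise it decays by `speed`
    per frame. While state > 0 the trapezoid keeps growing.
    """
    if beat_now:
        return target
    return max(0, state - speed)

# Simulate one beat followed by quiet frames: the rail keeps growing
# for target / speed = 15 frames.
state, growing_frames = 0, 0
for beat in [True] + [False] * 20:
    state = update_state(state, beat)
    if state > 0:
        growing_frames += 1
print(growing_frames)   # -> 15
```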
The Essentia function detects beats that match human perception fairly well. Before settling on it, we tried several libraries to detect beats and notes, but they did not work well.
At the very beginning, we tried to use the Fast Fourier Transform (FFT) to detect the rhythm of the music on the fly. To be more specific, using the wave library, we computed the FFT of each 1/60 s slice of the music every 1/60 s, and the outcome of the FFT was the magnitude of the music at different frequencies.
Classifying the outcome by frequency, we divided it into 4 channels ordered from high to low. Depending on which channel had the largest magnitude, we created a trapezoid with a rather small length. Then in the next 1/60 s, if the highest magnitude still fell into the previous channel, the trapezoid gained length; if not, it only showed as a narrow stripe on the rail and disappeared behind the verification zone so quickly that users could hardly play. Basically, the blocks generated directly by the FFT were not sufficient to represent the beats and notes of the music: most of the time the blocks were split across different channels and became very short. To compensate, we performed the FFT several frames ahead of displaying the trapezoids, so we could see how long each trapezoid would be and filter out the short ones. However, even with the short ones ruled out, the blocks that appeared still failed to match the beat of the music.
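For reference, the band-classification step can be sketched with numpy's FFT. The 44100 Hz sample rate and the synthetic sine input are assumptions for illustration:

```python
import numpy as np

RATE = 44100                  # assumed sample rate
FRAME = RATE // 60            # one 1/60 s slice, 735 samples

def dominant_channel(slice_, num_channels=4):
    """Split the FFT magnitude spectrum into equal bands and return
    the index of the band with the largest total energy.

    Channel 0 is the highest-frequency band, matching the
    high-to-low ordering described in the text."""
    mags = np.abs(np.fft.rfft(slice_))
    bands = np.array_split(mags[1:], num_channels)  # drop the DC term
    energies = [band.sum() for band in bands]
    return num_channels - 1 - int(np.argmax(energies))

# A 440 Hz tone lands in the lowest band of the 0..22050 Hz spectrum.
t = np.arange(FRAME) / float(RATE)
print(dominant_channel(np.sin(2 * np.pi * 440 * t)))   # -> 3
```

A pure tone classifies cleanly; the problem described above is that real music jumps between bands from one 1/60 s slice to the next.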
--Time control.
Time control is crucial in our project. We have to make sure that what you hear and what you see on the screen are always synchronized; otherwise users may doubt whether the program is visualizing the rhythm of the correct music.
We need to control the frame rate precisely both when we manually generate the music map file and when we visualize the rhythm on the screen. To determine when to show a beat, we also need to know how much time has elapsed since the music started, because the output of the beat detection is also expressed as points in time since the start. We chose to use the time module of Python. In an early version, when we were trying to do rhythm analysis in real time, we used the threading module to open another thread for the FFT. The code looked like:
import threading

def _fft():
    threading.Timer(interval, self._fft, args).start()
    # do FFT and display the result

self._fft()
This code means that whenever we call self._fft(), it starts a new thread after a specified time interval to do the next round of analysis and display; the interval is set according to the frame rate. However, it turned out to accumulate a delay of dozens of seconds by the time a three-minute piece of music finished playing. So we modified the code to do all the analysis in the main thread:
from time import time, sleep

while running:
    timestamp = time()
    # analyze and display
    while time() - timestamp < interval:
        sleep(0.00001)
At the very beginning of each loop, it obtains a timestamp, and the next loop is supposed to begin one interval later. We used this code to read the map file of a twenty-second piece of music; the final elapsed time was tolerable when we ran the program on our own laptops, say 23 seconds, but it became around 28 seconds when we tested on the RPi. So even though no computation-intensive code was put after the sleep, the error was still large. Actually we only need a small modification to make it really precise:

timestamp = time()
while running:
    # analyze and display
    while time() - timestamp < interval:
        sleep(0.00001)
    timestamp += interval

Initializing the time counter outside the loop, and increasing it by the specified interval in each iteration, ensures that it is always a multiple of the interval. An even better way is to sleep for exactly the remaining time; it also lets us know how much time has passed since the first loop started, through the variable display_start:

timestamp = time()
display_start = timestamp
while running:
    # analyze and display
    sleep_time = timestamp + interval - time()
    if sleep_time > 0:
        sleep(sleep_time)
    timestamp += interval
Another challenge we met is how to make the music start playing at a specified point in time. We can play music using Pygame easily:

pygame.mixer.init()
pygame.mixer.music.load(music_file)
pygame.mixer.music.play()

However, in our tests, even if we execute the init() and load() functions long before executing play(), to ensure that the music file has already been read into main memory, there is still a delay between calling play() and the RPi actually making sound. It is possible to call play() ahead of displaying the rhythm, but since the length of the delay is not always the same, the mismatch remains.
In our final solution, before displaying the rhythm, we open an mplayer process using the subprocess module of Python, let it play the music but immediately send it a "pause" command through a FIFO, so that it makes no sound until we send it a second "pause". Using mplayer and a FIFO makes the delay nearly negligible, though writing the command to a FIFO that resides on disk still costs a tiny amount of time. As a further improvement, it is worth trying pipes or sockets, which do not involve disk I/O, for even more precise time control.
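The pause-toggle trick only relies on writing command lines into a named pipe that mplayer reads through -input file=&lt;fifo&gt;. The mechanism can be demonstrated without mplayer by pairing the FIFO with a reader thread standing in for the player (os.mkfifo is POSIX-only, an assumption that holds on the RPi):

```python
import os
import tempfile
import threading

def send_command(fifo_path, command):
    """Write one slave-mode command line (e.g. 'pause') into the FIFO,
    the same way we toggle mplayer on and off."""
    with open(fifo_path, "w") as fifo:
        fifo.write(command + "\n")

if __name__ == "__main__":
    fifo_path = os.path.join(tempfile.mkdtemp(), "mplayer_fifo")
    os.mkfifo(fifo_path)
    received = []

    def reader():   # stands in for mplayer's -input file=<fifo> loop
        with open(fifo_path) as fifo:
            received.append(fifo.readline().strip())

    t = threading.Thread(target=reader)
    t.start()
    send_command(fifo_path, "pause")   # first 'pause': stay silent
    t.join()
    print(received)   # -> ['pause']
```

Opening the FIFO for writing blocks until a reader has it open, which is why the real mplayer process must be started (and the FIFO created) before the first command is sent.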
Software Design
We reached our goal of generating the animation of colorful sliding blocks from both manually made and program-generated map files.
Starting from the main page, the player can choose a piece of music from the difficulty list, and the piTFT starts displaying the corresponding animation, well synchronized with the music being played. We went through several versions of the polling loop in the main function and in the end successfully controlled the execution time of the Python statements in every iteration, so that the timing error stays within an imperceptible range over a three-minute piece of music. Besides, we implemented parallel processing to accelerate the analysis that produces the numpy array of beat information, shrinking that step from 17 seconds to 5. As a result, players can start playing shortly after a song is chosen and tap the keys in time with both the animation and the music.
Hardware Design
The implementation of the four keys is successful as well. The keys are sensitive enough to detect very fast tapping with almost no delay. Besides, the four keys are fixed in a steady position and arranged in a half-moon arc to provide a better user experience.
xy363@cornell.edu
--Wrote the key-initialization code for the software design.
--Refined the map file by limiting the shortest block size.
--Made and refined keys with Ruoxuan Xu.
rx65@cornell.edu
--Programmed the tool for generating manually made map files.
--Created and tested manually made map files.
--Tested and refined the electrical keys with Xiaoxing Yan.
pl557@cornell.edu
--Wrote and tested the code for detecting beats, reading and writing map files, and visualizing the rhythm of the music.
--display.py is a wrapper around Pygame.
It includes methods to create, refresh and clear a screen, render graphics and get the mouse click position.
import pygame
import os

width, height = 320, 240
WHITE = 255, 255, 255
RED = 255, 0, 0
GREEN = 0, 255, 0
BLUE = 0, 0, 255
YELLOW = 255, 255, 0
PINK = 255, 192, 203
BLACK = 0, 0, 0

class Screen(object):
    def __init__(self, on_tft=False):
        if on_tft:
            os.putenv('SDL_FBDEV', '/dev/fb1')
            os.putenv('SDL_VIDEODRIVER', 'fbcon')
            os.putenv('SDL_MOUSEDRV', 'TSLIB')
            os.putenv('SDL_MOUSEDEV', '/dev/input/touchscreen')
        pygame.init()
        self.screen = pygame.display.set_mode((width, height))
        pygame.mouse.set_visible(not on_tft)
        self.clock = pygame.time.Clock()

    def clear(self):
        self.screen.fill(BLACK)

    def render_circle(self, center, radius, color):
        pygame.draw.circle(self.screen, color, center, radius, 0)

    def render_polygon(self, points, color, width=0):
        pygame.draw.polygon(self.screen, color, points, width)

    def render_text(self, text_pos, font, color):
        text_font = pygame.font.Font(None, font)
        for text, pos in text_pos.items():
            text_surface = text_font.render(text, True, color)
            self.screen.blit(text_surface, text_surface.get_rect(center=pos))

    def tick(self, frame_rate):
        self.clock.tick(frame_rate)

    @classmethod
    def display(cls):
        pygame.display.flip()

    @classmethod
    def get_click_pos(cls):
        return [pygame.mouse.get_pos() for event in pygame.event.get()
                if event.type is pygame.MOUSEBUTTONUP]
--keyboard.py is a wrapper around RPi.GPIO.
It includes functions to initialize, clean up and read the status of GPIO pins.
import RPi.GPIO as GPIO

def key_initiate(all_pin):
    GPIO.setmode(GPIO.BCM)
    [GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP) for pin in all_pin]

def key_status(all_pin):
    return [not GPIO.input(pin) for pin in all_pin]

def key_clean():
    GPIO.cleanup()
--manual_make.py is used for generating the map file manually.
It uses RPi.GPIO to read the status of the buttons frame by frame,
turns the readings into an array, and stores it in the map file.
import RPi.GPIO as GPIO
import keyboard
import pygame
from time import time, sleep
import numpy as np

dict_pin = {5: 8, 6: 4, 13: 2, 26: 1}
all_pin = [5, 6, 13, 26]
keyboard.key_initiate(all_pin)
pygame.mixer.init()
pygame.mixer.music.load("AWA.mp3")
pygame.mixer.music.play()
record = []
timestamp = time()
interval = 1.0 / 60
running = True
while running:
    try:
        current = 0
        for pin, number in dict_pin.items():
            if not GPIO.input(pin):
                current += number
        record.append(current)
        sleep_time = timestamp + interval - time()
        if sleep_time > 0:
            sleep(sleep_time)
        timestamp += interval
    except KeyboardInterrupt:
        running = False
np.save("record", np.array(record))
GPIO.cleanup()
--visualize.py is built on top of display.py.
It includes the classes Trapezoid, Note, Visualizer and MapReader.
The first three are used to create the user interface of our application.
The last one, MapReader, is used to read manually made map files and display them using Visualizer.
import display
import keyboard
from time import time, sleep
import numpy as np
import subprocess

verify_height = 40
trapezoid_top_width = 10
top_anchor = range(display.width / 2 - trapezoid_top_width * 2,
                   display.width / 2 + trapezoid_top_width * 2 + 1,
                   trapezoid_top_width)
bottom_anchor = range(0, display.width + 1, display.width / 4)
select_pos = [(160, 30), (160, 90), (160, 150), (160, 210)]

class Trapezoid(object):
    def __init__(self, screen, top, bottom, index, left_slope, right_slope, color, speed):
        self.screen = screen
        self.top = top
        self.bottom = bottom
        self.index = index
        self.left_slope = left_slope
        self.right_slope = right_slope
        self.color = color
        self.speed = speed

    def _get_points(self):
        return [[(self.left_slope
                  and (self.top / self.left_slope + top_anchor[self.index],)
                  or (top_anchor[self.index],))[0], self.top],
                [(self.right_slope
                  and (self.top / self.right_slope + top_anchor[self.index + 1],)
                  or (top_anchor[self.index + 1],))[0], self.top],
                [(self.right_slope
                  and (self.bottom / self.right_slope + top_anchor[self.index + 1],)
                  or (top_anchor[self.index + 1],))[0], self.bottom],
                [(self.left_slope
                  and (self.bottom / self.left_slope + top_anchor[self.index],)
                  or (top_anchor[self.index],))[0], self.bottom]]

    def render(self, color=None):
        self.screen.render_polygon(self._get_points(), color and color or self.color)
        return self

    def move_down(self):
        self.top = min(display.height, self.top + self.speed)
        self.bottom = min(display.height - verify_height, self.bottom + self.speed)
        return self.top < display.height - verify_height

class Note(object):
    def __init__(self, screen, index, color, speed):
        self.color = color
        self.left_slope = (top_anchor[index] != bottom_anchor[index]
                           and (display.height / float(bottom_anchor[index] - top_anchor[index]),)
                           or (0,))[0]
        self.right_slope = (top_anchor[index + 1] != bottom_anchor[index + 1]
                            and (display.height / float(bottom_anchor[index + 1] - top_anchor[index + 1]),)
                            or (0,))[0]
        self.trapezoids = []
        self.verify = Trapezoid(screen, display.height - verify_height, display.height, index,
                                self.left_slope, self.right_slope, self.color, speed)

    def move_all_trapezoids(self, pressed):
        self.trapezoids = [trapezoid.render() for trapezoid in self.trapezoids if trapezoid.move_down()]
        self.verify.render(pressed and self.trapezoids and self.trapezoids[0].bottom >= display.height - verify_height
                           and self.color or display.PINK)

class Visualizer(object):
    def __init__(self, music_path="", fifo="", trapezoid_height=20, speed=2, on_tft=False, screen=None):
        self.screen = screen and screen or display.Screen(on_tft=on_tft)
        self.screen.clear()
        self.screen.render_text({"Loading...": (160, 120)}, 40, display.WHITE)
        self.screen.display()
        self.cross_screen = (display.height - verify_height) / speed
        self.notes = [Note(self.screen, 0, display.WHITE, speed),
                      Note(self.screen, 1, display.RED, speed),
                      Note(self.screen, 2, display.GREEN, speed),
                      Note(self.screen, 3, display.BLUE, speed)]
        self.trapezoid_height = trapezoid_height
        self.state = [0, 0, 0, 0]
        self.speed = speed
        if music_path and fifo:
            self.fifo = fifo
            subprocess.call(["mplayer -input file={} {} &".format(fifo, music_path)], shell=True)
            subprocess.check_output("echo 'pause' > {}".format(fifo), shell=True)

    def play_music(self):
        subprocess.check_output("echo 'pause' > {}".format(self.fifo), shell=True)

    def stop_music(self):
        subprocess.check_output("echo 'quit' > {}".format(self.fifo), shell=True)

    def map_file_refresh(self, frame, pressed):
        self.screen.clear()
        for i in range(4):
            if frame >> 3 - i & 0x01:
                if len(self.notes[i].trapezoids) and self.notes[i].trapezoids[-1].top == display.height:
                    self.notes[i].trapezoids[-1].top -= self.speed
                else:
                    self.notes[i].trapezoids.append(Trapezoid(self.screen, -self.speed, 0, i,
                                                              self.notes[i].left_slope,
                                                              self.notes[i].right_slope,
                                                              self.notes[i].color, self.speed))
            self.notes[i].move_all_trapezoids(pressed[i])
        self.screen.display()
        for pos in self.screen.get_click_pos():
            print pos
            return False
        return True

    def detection_refresh(self, frame, pressed):
        self.screen.clear()
        new_state = [0] * 4
        for i in range(4):
            if frame[i]:
                # set state value to designated height for countdown
                new_state[i] = self.trapezoid_height
                if self.state[i]:
                    self.notes[i].trapezoids[-1].top -= self.speed
                else:
                    self.notes[i].trapezoids.append(Trapezoid(self.screen, -self.speed, 0, i,
                                                              self.notes[i].left_slope,
                                                              self.notes[i].right_slope,
                                                              self.notes[i].color, self.speed))
            elif self.state[i]:
                self.notes[i].trapezoids[-1].top -= self.speed
                new_state[i] = self.state[i] - 1
            self.notes[i].move_all_trapezoids(pressed[i])
        self.state = new_state
        self.screen.display()
        for pos in self.screen.get_click_pos():
            print pos
            return True
        return False

class MapReader(object):
    def __init__(self, music_path, map_path, pin_list, frame_rate, speed=2, on_tft=False, screen=None):
        self.record = np.load(map_path)
        self.pin_list = pin_list
        keyboard.key_initiate(pin_list)
        self.interval = 1.0 / frame_rate
        self.visualizer = Visualizer(music_path, "mplayer_fifo", 0, speed, on_tft, screen)

    def __call__(self):
        timestamp = time()
        display_start = time()
        counter = 0
        try:
            for idx in range(len(self.record)):
                counter += 1
                if counter == self.visualizer.cross_screen:
                    self.visualizer.play_music()
                if not self.visualizer.map_file_refresh(self.record[idx], keyboard.key_status(self.pin_list)):
                    break
                sleep_time = timestamp + self.interval - time()
                if sleep_time > 0:
                    sleep(sleep_time)
                timestamp += self.interval
        except KeyboardInterrupt:
            pass
        finally:
            self.visualizer.stop_music()
            keyboard.key_clean()
            print("Elapsed time: {:.2f}s".format(time() - display_start))

if __name__ == "__main__":
    map_reader = MapReader("peace.mp3", "peace&love.npy", [26, 13, 6, 5], 60)
    map_reader()
--analyze.py is built on top of visualize.py.
It includes the class Analyzer, which reads the map file that stores the beats and displays them using Visualizer. If no map file is found, it imports the Essentia library to do beat detection, stores the result into a map file, and then displays the beats.
The classes Visualizer and Analyzer are designed to be easy to use: their instances are callable after initialization.
All functions, including time control, playing music and refreshing the screen, are executed at the designated frame rate until the screen is touched.
Their usage is shown after "if __name__ == "__main__":" in both files.
import os
import multiprocessing
import numpy as np
import visualize
import keyboard
from time import time, sleep

# https://stackoverflow.com/a/46266853/7873124
# put declaration outside of the Analyzer class
# multiprocess cannot pickle an undefined function
def detect_onset(audio, index):
    # should be able to fetch the module from cache
    import essentia.standard as ess_std
    from essentia import array
    print("Subprocess {} starts".format(index))
    processing_start = time()
    onset_detector = ess_std.OnsetDetection(method="complex")
    window = ess_std.Windowing(type="hann")
    fft = ess_std.FFT()
    c2p = ess_std.CartesianToPolar()
    onsets = ess_std.Onsets()
    frames = []
    for frame in ess_std.FrameGenerator(audio, frameSize=1024, hopSize=512):
        mag, phase = c2p(fft(window(frame)))
        frames.append(onset_detector(mag, phase))
    onsets_array = onsets(array([frames]), [1])
    print("Subprocess {} finished. Elapsed time: {:.2}s".format(index, time() - processing_start))
    return onsets_array

class Analyzer(object):
    def __init__(self, music_path, sample_rate, frame_rate, least_energy, pin_list, response_time,
                 speed=2, on_tft=False, screen=None):
        keyboard.key_initiate(pin_list)
        self.pin_list = pin_list
        self.frame_rate = frame_rate
        self.time_interval = 1.0 / frame_rate
        self.least_energy = least_energy
        self.sample_rate = sample_rate
        self.visualizer = visualize.Visualizer(music_path, "mplayer_fifo",
                                               int(speed * frame_rate * response_time), speed, on_tft, screen)
        stats_file = "{}.npz".format(os.path.splitext(music_path)[0])
        # if the music has not been processed before
        if not os.path.isfile(stats_file):
            print("Loading Essentia module...")
            import essentia.standard as ess_std
            print("Loading music...")
            loader = ess_std.MonoLoader(filename=music_path)
            audio = loader()
            pool = multiprocessing.Pool()
            num_process = multiprocessing.cpu_count()
            segment_length = int(len(audio) / num_process) / 1024 * 1024
            onset_collector = []
            self.num_frames = len(audio)
            print("Calculating onsets...")
            processing_start = time()
            results = [None] * num_process
            for i in range(num_process):
                results[i] = pool.apply_async(detect_onset, args=(
                    audio[segment_length * i: min(segment_length * (i + 1), len(audio))], i))
            pool.close()
            pool.join()
            for i in range(num_process):
                onsets = results[i].get() + i * float(segment_length) / self.sample_rate
                onset_collector += onsets.tolist()
            onset_collector.append(np.finfo(float).max)  # so that read_onset will never reach len(self.onsets)
            self.onsets = np.array(onset_collector)
            print("Onset detection finished. Elapsed time: {:.2f}s".format(time() - processing_start))
            np.savez(os.path.splitext(music_path)[0], num_frames=np.array([self.num_frames]), onsets=self.onsets)
        else:
            stats = np.load(stats_file)
            self.num_frames = stats["num_frames"][0]
            self.onsets = stats["onsets"]
            print("Pre-processing skipped")

    def __call__(self):
        timestamp = time()
        display_start = time()
        read_onset = 0
        counter = 0
        try:
            for i in xrange(int(self.num_frames / float(self.sample_rate) * self.frame_rate)):
                counter += 1
                if counter == self.visualizer.cross_screen:
                    self.visualizer.play_music()
                frame = [0] * 4
                if time() - display_start > self.onsets[read_onset]:
                    read_onset += 1
                    frame[np.random.randint(0, 4)] = 1
                clicked = self.visualizer.detection_refresh(frame, keyboard.key_status(self.pin_list))
                if not clicked:
                    sleep_time = timestamp + self.time_interval - time()
                    if sleep_time > 0:
                        sleep(sleep_time)
                    timestamp += self.time_interval
                else:
                    break
        except KeyboardInterrupt:
            pass
        finally:
            self.visualizer.stop_music()
            keyboard.key_clean()
            print("Elapsed time: {:.2f}s".format(time() - display_start))

if __name__ == "__main__":
    analyzer = Analyzer("locked.mp3", 44100, 60, 0, [26, 13, 6, 5], 0.05)
    analyzer()
--main.py is built on top of display.py, visualize.py and analyze.py.
It shows the cover page for the user to choose which music to play, then calls either MapReader or Analyzer to read the corresponding map file and display the rhythm.
import display
from visualize import select_pos, MapReader
from analyze import Analyzer
from time import sleep

select_text = ["Select a music", "Peace & Love", "Locked Away", "July"]
select_music = {k: v for (k, v) in zip(select_text, select_pos)}
screen = display.Screen(on_tft=True)
all_pin = [5, 6, 13, 26][::-1]
frame_rate = 60
response_time = 0.05

def display_select_music():
    screen.clear()
    screen.render_text(select_music, 40, display.WHITE)
    screen.display()

display_select_music()
running = True
while running:
    pos = screen.get_click_pos()
    if pos:
        if pos[0][1] < 60:
            running = False
        else:
            if pos[0][1] < 120:
                map_reader = MapReader("peace.mp3", "peace&love.npy", all_pin, frame_rate, screen=screen)
                map_reader()
            elif pos[0][1] < 180:
                analyzer = Analyzer("locked.mp3", 44100, frame_rate, 0, all_pin, response_time, screen=screen)
                analyzer()
            else:
                analyzer = Analyzer("july.mp3", 44100, frame_rate, 0, all_pin, response_time, screen=screen)
                analyzer()
            display_select_music()
    sleep(0.01)