Categories
Various

Akira Predicted the 2020 Tokyo Olympics

The animated movie Akira correctly predicted the 2020 Tokyo Olympics. The message on this billboard says “147 days until the start of the Olympics”:

2020 Tokyo Olympics

Akira is set in the year 2019, and its final scenes take place in the Olympic stadium built for the 2020 Games.

Akira Tokyo Olympics

Let’s hope World War III doesn’t break out and that Akira doesn’t destroy the stadium. The movie also predicted a radioactivity problem, though as a consequence of war rather than of a tsunami hitting Fukushima.

A couple of Akira fan remixes about the 2020 Tokyo Olympics.

Akira Tokyo Olympics

Akira Tokyo Olympics

Via @kotecinho and Kotaku


Microsoft Excel Art

Tatsuo Horiuchi is a 73-year-old artist who uses Microsoft Excel to create his works of art. He took up the hobby when he retired and chose Excel because it was cheaper than graphics editors such as Adobe Photoshop. Moreover, according to Horiuchi, Excel offers more drawing tools than some graphics editors like Microsoft Paint. These are some of his creations:

Excel Art

Excel Art

Excel Art

Excel Art

If you want to see how this is possible, here is a video of a Gundam drawn using Excel.

Source: Aramatheydidnt


Kyoto Scientists Visualize People's Dreams

A team of researchers led by Yukiyasu Kamitani has managed to decode the contents of the dreams of three people.

To achieve this feat, they first collected dream data from three volunteers using fMRI together with EEG/EOG/EMG/ECG recordings, both while the subjects were sleeping and while they were awake. After each awakening they also wrote down the volunteer’s report of what they had been dreaming about. Next, they classified as visual those dreams containing at least one visual element, and assigned an English label to each element using WordNet. From there, they built data vectors using each concept as an index and applied machine learning techniques based on support vector machines. The fascinating thing is that training converged for all three volunteers, so the learned decoders could then be used to tell what each person was dreaming shortly before waking up.
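The decoding step described above can be sketched in a few lines. This is only a minimal illustration of SVM-based decoding, not the authors’ actual pipeline: the “fMRI” vectors are synthetic, and the concept labels, dimensions, and noise levels are all assumptions made up for the example.

```python
# Sketch: train a linear SVM to map brain-activity vectors to
# dream-content concepts, then "decode" a new pre-awakening scan.
# All data here is synthetic; the real study used voxel patterns
# recorded from visual cortex.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
concepts = ["character", "male", "female"]  # WordNet-style labels (illustrative)

n_voxels, n_per_class = 50, 60
# Each concept gets its own mean activity pattern; samples are noisy copies.
means = rng.normal(0.0, 1.0, size=(len(concepts), n_voxels))
X = np.vstack([means[i] + 0.5 * rng.normal(size=(n_per_class, n_voxels))
               for i in range(len(concepts))])
y = np.repeat(np.arange(len(concepts)), n_per_class)

clf = LinearSVC().fit(X, y)  # one-vs-rest linear SVM

# Decode a fresh scan taken just before the subject wakes up:
new_scan = means[1] + 0.5 * rng.normal(size=(1, n_voxels))
decoded = concepts[clf.predict(new_scan)[0]]
print(decoded)
```

With the class means well separated relative to the noise, the classifier recovers the concept behind the new scan, which is the essence of what the converged decoders allowed the researchers to do.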

To make it even more impressive, they decided to map the word for each visual element dreamed by the volunteers to images retrieved via Google Images. That is how images very similar to a person’s dreams can be shown. For example, this is the decoded visualization for subject 2 while he is dreaming:

This is what the subject said when he woke up; he had been dreaming about written characters:

“What I was just looking at was some kind of characters. There was something like a writing paper for composing an essay, and I was looking at the characters from the essay or whatever it was. It was in black and white and the writing paper was the only thing that was there. And shortly before that I think I saw a movie with a person in it or something but I can’t really remember.”

visualizing dreams
Here the volunteer was dreaming about people (male and female)


They also compared this against data collected while the subjects were awake.


This is what the data vectors look like.

Source: Neural Decoding of Visual Imagery During Sleep