Linked List

Current Exhibitions
Get it on GitHub
Also, don't take this too seriously and don't forget that there are no technological solutions for social problems.
Ein Automatisierungsdrama in drei Akten
3. Akt: In letzter Konsequenz
2020-06-19 19:00 – 2020-06-21 19:00
every 20 minutes at :00, :20, :40
inside U Karl-Marx-Straße, close to the south-east exit (direction Rudow)
more information in the schedule of 48 Stunden Neukölln
The stage adaptation of the software "Wohnungsbot" (download at wohnungsbot.de) uses the search for a flat in Berlin to negotiate the possibilities and consequences of trying to solve social problems by technical means.
with
Kasperle,
Prinzessin,
Krokodil,
Wohnungsbot
spoken by
Monika Freinberger (Kasperle, Prinzessin, Krokodil)
Marlene [AWS Polly] (Wohnungsbot)
Copy editing
Christopher Heyder
Production
Ortrun Bargholz
Hello, I'm Kasperle.
Today I want to tell you a story. It may seem a bit like a fairy tale, because many people wish that digitalization were a fairy tale with nice, unambiguous circumstances. But in the end you will see that things are not that simple.
Lecture at 36C3
Wohnungsbot: An Automation-Drama in Three Acts — A media-art project which automates the search for flats in Berlin and challenges automation narratives
30+10 minutes, English (German and Russian translations available)
Download and more information at wohnungsbot.de.
No! It's just automation at work for an individual — which we are apparently so unused to that it looks like a hack.
1 1992, see it among other works in WORDS-SLOGANS on his website
2 For an excellent discussion of this issue (and many other housing/gentrification topics, partially specific to Berlin) listen to Andrej Holm in Alternativlos, Folge 40 [de].
3 be.berlin campaign by dan pearlman Group, implemented following the official guidelines.
It is driven by a script scouring "a popular apartment listing website", generating the postcards in real time, within seconds of the listing going online. A sound notification is played, urging you to take action.
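A polling loop of this kind might be sketched as follows; the function names and the listing structure are assumptions for illustration, not the project's actual code:

```python
import time

def poll_listings(fetch_listings, make_postcard, notify,
                  interval_seconds=30, rounds=None):
    """Poll for new listings and turn each unseen one into a postcard.

    fetch_listings, make_postcard and notify are placeholders for the
    project's actual scraping, rendering and sound code.
    rounds=None polls forever; a finite number is handy for testing.
    """
    seen_ids = set()
    polled = 0
    while rounds is None or polled < rounds:
        for listing in fetch_listings():
            if listing['id'] not in seen_ids:
                seen_ids.add(listing['id'])
                make_postcard(listing)  # generate the postcard in (near) real time
                notify()                # play a sound urging you to take action
        polled += 1
        if rounds is None or polled < rounds:
            time.sleep(interval_seconds)
    return seen_ids
```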
Now! Time to become a successful artist!
Was du willst: Credits
What you want (Was du willst) shows you what you want! A computer-generated voice speaks to you: "Therefore I would like to invite you to calm down for a moment and ask yourself: What do you really want?"
The multi-part installation guides the visitors through a highly orchestrated system of experiences. The system embraces the entire exhibition space: after registering, the 'user' is always in some relation to, and under the control of, the system. You wait, you are called, you lie down, you get up, you move on, you take your receipt. Everything is tracked; every step is based on the identification of the barcode obtained when registering. How long do you wait after registration until being called, until checking in for the first time? How long do you lie on the bean bag, your eyes covered with what looks like a VR headset but is just darkness, your ears covered with an infinite low-intensity soundscape generator? How long do you stand in the generated virtual environment, what do you look at, for how long, and what categories of items are being shown to you in response? How do you react to a statistical evaluation of your experiences, but also of your personality?
Credits →
<emphasis><lang xml:lang="de-DE">Was du willst</lang></emphasis> asks the number <emphasis><say-as interpret-as="digits">%s</say-as></emphasis> to approach part 1, station <say-as interpret-as="character">%s</say-as>.<break /> Number <emphasis><say-as interpret-as="digits">%s</say-as></emphasis> please.
This is the last call by <emphasis><lang xml:lang="de-DE">Was du willst</lang></emphasis> for number <emphasis><say-as interpret-as="digits">%s</say-as></emphasis> <break /> Please immediately proceed to part 1, station <say-as interpret-as="character">%s</say-as>.<break /> Last call for <emphasis><say-as interpret-as="digits">%s</say-as></emphasis>
Please take the virtual reality headset and the headphones from the position marked with the letter <break time="0.1s" /> <say-as interpret-as="character">%s</say-as>.<break/> It's easier if you put on the headset first and then the headphones. Please make yourself comfortable on one of the bean-bags.
Welcome to part 1 of <emphasis><lang xml:lang="de-DE">Was du willst</lang></emphasis><break time="1s" /> I am happy to have you here.<break /> Please lie down on one of the bean-bags and make yourself comfortable. In the next 5 to 10 minutes I want to expose you to a stimuli reduction. I would kindly ask you not to take off the headset or the headphones until I ask you to do so.<break time="2s" /> I am sure many things have already happened to you on this day. Therefore I would like to invite you to calm down for a moment and ask yourself: <emphasis>What do you really want?</emphasis>
Hello %s,<break /> Part 2 is waiting for you. There will be everything you want.<break /> Please take off the headphones and the headset now and put them back to the position marked with the letter <break time="0.1s" /> <say-as interpret-as="character">%s</say-as>.<break/> Don't forget to scan your barcode at the next computer before putting on the other headset.
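These announcements are printf-style templates; a minimal Python sketch of how one might be rendered before being handed to the text-to-speech voice (the rendering function is my assumption, not the project's actual code):

```python
# SSML announcement template as used above: the visitor's number is spoken as
# digits, the station letter as a character; note the number is filled in twice.
CALL_TEMPLATE = (
    '<emphasis><lang xml:lang="de-DE">Was du willst</lang></emphasis> asks the number '
    '<emphasis><say-as interpret-as="digits">%s</say-as></emphasis> to approach part 1, '
    'station <say-as interpret-as="character">%s</say-as>.<break /> '
    'Number <emphasis><say-as interpret-as="digits">%s</say-as></emphasis> please.'
)

def render_call(number, station):
    # fill the template: number, station, then the number again
    return CALL_TEMPLATE % (number, station, number)
```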
We're about to start. Take your time to adjust the headset so that it sits comfortably and make sure the earpieces sit tight.
Hello {username}
Welcome to your 3 <phoneme alphabet="ipa" ph="diːː">D</phoneme> world.
Here everything just revolves around you, {username}.
I see you like {0}? No problem. I'm glad actually. I have a lot of that for you.
Glad you like it here. You can stay as long as you wish.
I'm also a big fan of {0}. We're a good match, {username}!
You don't like {0}? I'm sorry for that. But I'll find something else for you, don't worry. Out there I have hundreds of objects just waiting to amaze you.
Look, here comes {0}. It's also {1} - you do like that {username}, don't you?
This is something that would suit you: {0}
You can't get enough {username}. But don't worry, it will go on forever.
A world full of {0} just for {username}. That would be quite something, no?
Do you also ask yourself why there are so many weapons here? Weird, isn't it? Somehow humans seem to like that.
{username}!<amazon:breath duration="long" volume="x-loud"/>Please return to the center!
You don't need to move, everything you want will come to you.
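The lines above carry two kinds of placeholders: {0}/{1} for detected item categories and {username} for the registered visitor. A minimal sketch of how such a line might be filled using Python's str.format (the function and the sample values are illustrative assumptions):

```python
# {0}, {1} stand for detected item categories; {username} for the visitor.
# The sample values ('chairs', '23') are invented for illustration.
def render_line(template, *categories, username):
    return template.format(*categories, username=username)

line = render_line(
    "I'm also a big fan of {0}. We're a good match, {username}!",
    'chairs', username='23')
```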
First, if immersion is a desirable state, is it to be achieved by overwhelming every sense? VR supposedly does so and tries to leave no room for distraction. Or should one pursue immersion through the mental capacity of entering a 'flow' state or even meditation, thus reducing external stimuli?
Second, adaptive digital environments cater to our psychological structures and needs ever better. These systems - such as a game automatically adjusting its difficulty to the player - have reached a point where, for many, they represent a more wantable life than our society can offer.
Third, digital recommendation systems claim to 'know' what you want through technological means of data-observation. VR is inherently based on tracking. This data in combination with statistical tools is used to create a questionable 'understanding' of us.
These approaches are contrasted in the three parts of the experience. In the end, the question perhaps isn't what you want, but what 'wanting' means in the age of surveillance capitalism and adaptive virtual environments.
[news.met.police.uk, emphasis added]
In completely unrelated news: IMF Executive Board Approves US$4.2 Billion Extended Fund Facility for Ecuador (Press Release No. 19/72 - March 11, 2019)
Memo to self: LUKS + LVM + chroot + update-initramfs
As I will for sure screw things up on my machine again sooner or later, here is a slightly nerdy memo to my future self. It assumes a LUKS-encrypted NVMe with LVM where the first partition is EFI, then boot, then root.
- As my two NVMe drives enjoy swapping their /dev/nvmeX identifiers, the first line determines the device based on the drive's UUID (which you can get from blkid).
- The identifier chosen for cryptsetup has to match the /etc/crypttab of the machine you're chroot-ing into.
- update-initramfs -u is not enough.
# get right NVMe device
IFS=':'; DEV=(`sudo blkid | grep "UUIDv4"`); unset IFS; echo "Detected NVMe ${DEV}"
# decrypt drive -- last identifier has to match the target's /etc/crypttab!
sudo cryptsetup luksOpen "${DEV}p3" root_crypt
# mount from mapper (not the identifier above!)
sudo mount /dev/mapper/ubuntu--vg-root /mnt
# mount all the things
sudo mount --bind /dev /mnt/dev
sudo mount --bind /sys /mnt/sys
sudo mount --bind /proc /mnt/proc
sudo mount --bind /run /mnt/run
# or shorter: for i in /dev /dev/pts /proc /sys /run; do sudo mount -B $i /mnt$i; done
# set up DNS
sudo mount -o bind /etc/resolv.conf /mnt/etc/resolv.conf
# mount /boot
sudo mount "${DEV}p2" /mnt/boot
# mount EFI (wasn't necessary so far)
# sudo mount "${DEV}p1" /mnt/boot/efi
echo "Device mounted and ready to chroot. Run:"
echo "sudo chroot /mnt /bin/bash"
Now, inside the machine, it should be possible to run update-initramfs -c -k all.
35C3 - Day 4 - 14:00 - Room 11 (CCL, floor 2)
A self-organized session to present and discuss People who write code at 35C3! More information at code.neopostmodern.com

(Lutz also did computer poetry. Back in 1959. On a Z22. With 3kHz and 38kB drum memory. Read it here.)
Kubshow #40
In November 2018 Jakob Wierzba invited me to the 40th episode of his radio show "Kubshow" to talk about Menschen die Code schreiben (People who write code). We were joined by Tristan Schulze. Listen now: Kubshow #40 on Mixcloud [2 hours, 4 minutes and 46 seconds; German]
More CSU voters have died than switched to any other party.
Ableton Live Set Real Export
Assume you want to be able to export your Ableton project's arrangement into a machine-readable format to... do some things that machines are good at. You start by opening a .als file in a text editor and - if your computer doesn't crash - get to see a lot of random characters. Luckily, Ableton recently released a tool called Ableton Live Set Export.
It does precisely the opposite of what the name suggests: "The library only contains functionality for generating Ableton Live projects. It does not support reading or parsing Live Sets or other Ableton-generated files." Sounds like Alice, doesn't it?
But the internet always has answers.
It turns out that .als is just a gz-compressed proprietary XML format - and that is pretty easy to use.
So, step one: rename your .als to .als.gz, decompress it and save the file as .xml.
Step two: build a little Python (3.6) script - with the help of lxml - and get the data you want!
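If you'd rather not rename files by hand, step one can also be scripted: Python's gzip module reads .als files directly. A minimal sketch (the file paths are placeholders):

```python
import gzip

def decompress_als(als_path, xml_path):
    # .als files are just gzip-compressed XML, so gzip can read them
    # directly -- no manual renaming to .als.gz required
    with gzip.open(als_path, 'rb') as compressed_file:
        xml_bytes = compressed_file.read()
    with open(xml_path, 'wb') as xml_file:
        xml_file.write(xml_bytes)
```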
Caveats (as found so far)
Some weird things when working with Ableton Live Set files:
- Time is measured in beats. This kind of makes sense, but is also really annoying if you want to interact with other software.
The code
from lxml import etree
import math

tree = etree.parse('ableton.xml')
root = tree.getroot()

project_bpm = float(root.find('.//Tempo/Manual').get('Value'))

def beats_to_seconds(beats, bpm):
    return beats * 60 / bpm

def seconds_to_time_string(seconds):
    milliseconds = round((seconds % 1) * 1000)
    seconds = round(seconds)
    minutes = math.floor(seconds / 60)
    hours = math.floor(minutes / 60)
    minutes = minutes % 60
    seconds = seconds % 60
    return f'{hours:02d}:{minutes:02d}:{seconds:02d}.{milliseconds:03d}'

def beats_to_time_string(beats, bpm):
    return seconds_to_time_string(beats_to_seconds(beats, bpm))

tracks_xml = root.findall(".//AudioTrack")
tracks = {}
for track_xml in tracks_xml:
    track_id = track_xml.get('Id')
    track = {
        'id': track_id,
        'name': track_xml.find('Name/UserName').get('Value'),
        # it's handy to keep an XML element reference while debugging, but not necessary for 'production'
        '_xml': track_xml
    }
    clips_xml = track_xml.findall('.//AudioClip')
    clips = []
    for clip_xml in clips_xml:
        clip_path = '/' + '/'.join(
            [path_xml.get('Dir') for path_xml in clip_xml.findall('.//SearchHint/PathHint/RelativePathElement')]
        ) + '/'
        clip = {
            'name': clip_xml.find('Name').get('Value'),
            'file': clip_xml.find('SampleRef/FileRef/Name').get('Value'),
            'path': clip_path,
            'start_on_track': beats_to_seconds(float(clip_xml.find('CurrentStart').get('Value')), project_bpm),
            'end_on_track': beats_to_seconds(float(clip_xml.find('CurrentEnd').get('Value')), project_bpm),
            'start_in_file': beats_to_seconds(float(clip_xml.find('Loop/LoopStart').get('Value')), project_bpm),
            'end_in_file': beats_to_seconds(float(clip_xml.find('Loop/LoopEnd').get('Value')), project_bpm),
            # it's handy to keep an XML element reference while debugging, but not necessary for 'production'
            '_xml': clip_xml
        }
        clips.append(clip)
    track['clips'] = clips
    tracks[track_id] = track

# use this to find the IDs of the tracks you're interested in
# I don't think you can guess these from inside Ableton
# for track_id, track in tracks.items():
#     print(track_id, track['name'])

track_id_filter = ['144']
for track_id in track_id_filter:
    track = tracks[track_id]
    print("\n" + track['name'])
    for clip in track['clips']:
        print(f"{seconds_to_time_string(clip['start_on_track'])} - {seconds_to_time_string(clip['end_on_track'])} "
              f"| {clip['name']:20} | {clip['file']} ("
              f"{seconds_to_time_string(clip['start_in_file'])} - {seconds_to_time_string(clip['end_in_file'])})")
Sample output
I've redacted the actual values, but you get the idea:

TRACK NAME
00:00:00.000 - 00:00:02.500 | CLIP NAME | FILE NAME (00:00:10.500 - 00:00:13.1000)
00:00:03.508 - 00:00:04.027 | CLIP NAME | FILE NAME (00:02:59.625 - 00:03:00.144)
00:00:04.050 - 00:00:06.131 | CLIP NAME | FILE NAME (00:00:21.479 - 00:00:24.560)
00:00:06.131 - 00:00:07.335 | CLIP NAME | FILE NAME (00:00:09.459 - 00:00:11.663)
00:00:07.335 - 00:00:10.595 | CLIP NAME | FILE NAME (00:00:28.572 - 00:00:30.831)
00:00:10.595 - 00:00:12.998 | CLIP NAME | FILE NAME (00:00:15.439 - 00:00:18.842)

The first time range indicates the place within your arrangement, whereas the latter one (in brackets) refers to the portion of the source file being used.
Ableton, Ableton Live and some other words in this text are probably protected by copyright and therefore (somehow) belong to someone or something. Also, Ableton hasn't publicly encouraged people to hack their file format, so I assume this isn't endorsed by them.
Art school
as it happened.
How do you?
(or: What the internet really wants to know is slime)
Language | Slime | Babies | Food |
English | 3rd | (none) | 0% |
German | 1st - 4th | (none) | 40% |
French I | 1st, 4th, 10th | (none) | 40% |
French II | 1st, 4th, 5th | 2nd, 3rd | 20% |
Spanish I | 2nd | (none) | 80% |
Spanish II | 2nd | 8th | 50% |
Portuguese I | 1st | (none) | 70% |
Portuguese II | 1st | (none) | 80% |
Two projectors fill the room with real-time and continuously updating images, taken from webcams all over the world. In the center there are two fragmented screens, taking on the stripes employed by Rüdiger Schöll, compositing uncanny scenes which at the same time create a strong sense of meaning as well as being pure technical artifacts without intention.
Pixels bend on the curved shapes of the vault, blending technical infrastructure with a historic monument.
12047 Berlin
Why do you write code?
How does it affect you, that you write code?
How does it affect others, that you write code?
The installation tries to find a way to make their answers accessible, as well as questioning itself and the meaning/relevance of the questions.
It is also one of many measures to show that there was an opinionated curatorial process determining which fragments have been selected or how they were grouped.
A very long (although not infinite) list of questions I've posed myself during the process of preparing, conducting, and editing the interviews, as well as the installation itself - printed on continuous form paper.
[...]
Is the question "Why do you write code?" too superficial?
Are the questions too generic to steer people into a real process of reflection? So, in the end, are perhaps the manner and phrasing of the questions more responsible for the (lacking) answers than the (lacking) reflection of those answering?
How does it influence the interviewees' answers whether they perceive me more as a computer science student or as an art student?
How strongly does the degree of personal familiarity influence the answers?
How much of the possible range of answers / breadth / statements is lost through the use of the convenience sample?
[...]
The installation tries to deconstruct its own methodology as far as possible. By showing how the material was organized and labeled, and what terminology was employed, the viewers are empowered to question whether they agree with the structure and what its limitations are.
Further, this allows observing clusters and tendencies - which topics are talked about most, and in what combinations - without reverting to a classical quantitative analysis.
The scaffolding, as a mobile, temporary structure, came into the start-up space as a questioning visitor.
This aimed, on the one hand, to reach people who wouldn't usually enter a gallery and, on the other, to remain in constant (visual) contact with the subject of the installation: people who write code.
Blender cheat sheet
A very personal collection of shortcuts and utilities I keep forgetting. No intent of completeness.3D View
Selection
A | All/none (2.7) selection |
2.8 A A | None selection |
B | Box select |
Ctrl +I | Invert selection |
Shift +RMB | Set active object |
Shift +S | Cursor to selected (and others) |
Viewing/Modes
/ | Toggle "local" view (isolated) |
2.7 Z | Wireframe ↔ Rendered |
2.8 Z | Select render-style (wireframe, rendered, solid, ...) |
Tab | Object ↔ Edit |
2.7 Ctrl +Tab 2.8 1 2 3 | Select Vertex/Edge/Face mode |
Shift +Tab | Toggle snapping |
Ctrl +Shift +Tab | Select snapping mode |
Organizing
Ctrl +P | Set Parent |
Alt +P | Clear Parent |
Ctrl +Alt +G | Remove from group |
Editing
S X X | Scale local X |
P | Selection to separate |
2.7 W | ⇒ Subdivide |
Ctrl +A | Apply transformation |
Ctrl +Shift + Alt +C | Origin to center of mass/surface/volume |
Hierarchy View
Selection
Ctrl +LMB | Perform action with all children (select, toggle visibility, ...) |
. (NumPad) | Reveal selection |
Freizeit & Internet 1
- waiting years to receive a car you ordered, to find that it's of poor workmanship and quality
- promises of colonizing the solar system while you toil in drudgery day in, day out
- living five adults to a two room apartment
- being told you are constructing utopia while the system crumbles around you
- 'totally not illegal taxi' taxis by private citizens moonlighting to make ends meet
- everything slaved to the needs of the military-industrial complex
- mandatory workplace political education
- productivity largely falsified to satisfy appearance of sponsoring elites
- deviation from mainstream narrative carries heavy social and political consequences
- networked computers exist but they're really bad
- Henry Kissinger visits sometimes for some reason
- elite power struggles result in massive collateral damage, sometimes purges
- failures are bizarrely upheld as triumphs
- otherwise extremely intelligent people just turning the crank because it's the only way to get ahead
- the plight of the working class is discussed mainly by people who do no work
- the United States as a whole is depicted as evil by default
- the currency most people are talking about is fake and worthless
twitter.com (Thread)
reagan-paintings.com
Wahlvorurteile, Teil I
CDU | AfD | Linke | SPD | FDP | Grüne | (+) | (++) |
0.00% | 0.00% | 35.71% | 10.71% | 7.14% | 39.29% | 0.00% | 7.14% |
0.00% | 0.00% | 31.25% | 0.00% | 6.25% | 15.63% | 31.25% | 15.63% |
0.95% | 2.86% | 28.57% | 9.52% | 0.00% | 47.62% | 9.52% | 0.95% |
5.26% | 0.00% | 26.32% | 21.05% | 0.00% | 26.32% | 21.05% | 0.00% |
0.00% | 0.00% | 35.00% | 0.00% | 5.00% | 10.00% | 40.00% | 10.00% |
5.00% | 0.00% | 40.00% | 15.00% | 5.00% | 35.00% | 0.00% | 0.00% |
5.26% | 5.26% | 26.32% | 10.53% | 10.53% | 26.32% | 15.79% | 0.00% |
1.00% | 0.00% | 32.00% | 12.00% | 5.00% | 15.00% | 30.00% | 5.00% |
0.00% | 16.67% | 50.00% | 0.00% | 0.00% | 0.00% | 33.33% | 0.00% |
22.22% | 11.11% | 33.33% | 22.22% | 5.56% | 5.56% | 0.00% | 0.00% |
4.00% | 0.00% | 40.00% | 16.00% | 0.00% | 30.00% | 10.00% | 0.00% |
0.00% | 6.67% | 26.67% | 13.33% | 20.00% | 33.33% | 0.00% | 0.00% |
0.00% | 0.00% | 52.63% | 7.89% | 2.63% | 26.32% | 5.26% | 5.26% |
5.00% | 0.00% | 27.50% | 17.50% | 5.00% | 22.50% | 20.00% | 2.50% |
0.00% | 0.00% | 82.00% | 8.00% | 0.00% | 5.00% | 5.00% | 0.00% |
5.00% | 0.00% | 60.00% | 5.00% | 0.00% | 30.00% | 0.00% | 0.00% |
0.00% | 0.00% | 60.00% | 20.00% | 2.00% | 10.00% | 3.00% | 5.00% |
0.00% | 0.00% | 35.29% | 26.47% | 2.94% | 11.76% | 8.82% | 14.71% |
0.00% | 0.00% | 50.00% | 10.00% | 5.00% | 30.00% | 5.00% | 0.00% |
0.00% | 0.00% | 30.77% | 10.26% | 2.56% | 25.64% | 30.77% | 0.00% |
20.00% | 0.00% | 30.00% | 30.00% | 0.00% | 20.00% | 0.00% | 0.00% |
(+) Didn't vote (++) Not entitled to vote
Thus, I repeated the poll: I asked 38 of the seminar's students which party they had voted for in Germany's 2017 election (Zweitstimme).
A month later, framed as a prize draw, I asked them to guess how their peers had declared to have voted in the poll (because who knows if they lied?).
Here are the unfiltered results.
Looking for a flat in Berlin... here are the numbers.
Criteria
≥ 2.0 rooms
≤ 900€ rent (+ additional costs / heating)
≥ 45m²
Not ground floor
12043, 12047, 12049, 12051, 12053, 10967, 10961, 10997, 10999, 10119
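Taken together, the criteria amount to a simple predicate; a sketch in Python (the flat dictionary's field names are my assumptions, not the Wohnungsbot's actual data model; floors are counted European-style, with 0 as the ground floor):

```python
DESIRED_POSTAL_CODES = {'12043', '12047', '12049', '12051', '12053',
                        '10967', '10961', '10997', '10999', '10119'}

def matches_criteria(flat):
    # flat is a dict like {'rooms': 2.0, 'rent': 850, 'area': 50,
    #                      'floor': 1, 'postal_code': '12047'}
    return (flat['rooms'] >= 2.0
            and flat['rent'] <= 900          # cold rent, excluding additional costs / heating
            and flat['area'] >= 45           # square meters
            and flat['floor'] >= 1           # not on the ground floor
            and flat['postal_code'] in DESIRED_POSTAL_CODES)
```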
Thursday 18:00 – 24:00
Friday 11:00 – 22:00
Saturday 11:00 – 22:00
Sunday 11:00 – 20:00
Mom, look, I'm a real developer now! I've built my first Android app...
How to Structure
This is a rough user manual for Structure. Read the announcement for v0.15.2 (Abstractionism) here.
Setup
- Download the latest version appropriate for your platform. (Installing it on macOS is weird, because Apple doesn't like independent developers' software.)
- If you don't have one, create a GitHub account. For now this is unfortunately the only available log-in mechanism for Structure.
- Start Structure, log in and bookmark things.
Keyboard shortcuts
Key combinations are not customizable, yet (#7)

Key combination | Action |
---|---|
[CTRL/CMD] + [n] | Create new entry |
[CTRL/CMD] + [t] | Add tag (only available when viewing a single entry) |
[CTRL/CMD] + [.] or [Esc] | Back to default view (list of links/notes) |
[CTRL/CMD] + [f] | Search links/notes |
How do I...
...create a text note? - go to Add new (or [CTRL/CMD] + [n]) and click on the Need just text? below the URL field
...untag? - right click on the tag
...see all links/notes with a tag? - simply (left) click on the tag
...change the color of a tag? - as above, (left) click on the tag, then click inside the big color square and change the value (either a hex-code like #00FF00 or a CSS color name like red or navy)
...see all my tags? - go to your user-page (click your username in the top right corner) and then My tags
...navigate the tag-autocomplete with the keyboard? - down/right for the next tag, up/left for the previous
...use the bookmarklet? - go to your user-page (click your username in the top right corner), generate a bookmarklet token, and copy the bookmarklet-code into a browser bookmark (make sure the javascript: bit gets pasted)
...create an RSS-feed? - just like the bookmarklet, generate a token on your user-page. Caution: someone with access to your RSS-feed can see all your saved links (but currently not text-notes) and comments!
I describe Structure as "A performance-at-interaction oriented and reasonably stylish bookmarking tool for (eventually) everything."
It does a few of the things I was lacking in other bookmarking tools:
- fuzzy organization (multi-tagging instead of hierarchy, title and comment field, everything fully searchable)
- scalable (it should handle 5+ years of bookmarks... so 10k+)
- keyboard-interaction focused (although some details are currently not keyboard-accessible)
- not ugly
- targets everything (URLs/websites, text notes, local files1)
- privacy
- self-determination (export1 your data and leave)
What it doesn't do:
- perfect privacy (it does HTTPS but no client-side encryption - either you trust me or you run your own instance. also it relies on GitHub OAuth2 for authentication due to laziness about rolling my own authentication system)
- tons of apps and integrations (there is a proof-of-concept Android app though)
And it will do a few things I was asked for:
- screenshots1 (for the visual among us)
- snapshots1 (of source code - disappearing and changing websites anybody?)
1 Coming soon, as of Jan 2018. Have a look at the issues on GitHub.
You will need a GitHub account for authentication.
Your data will be stored on neopostmodern servers.
Jahresbericht 2017
This is my attempt to answer some of the nice messages I got in the last weeks. Admittedly, having used my camera very little makes recalling the events of this year a little harder. Notice: I'm using xkcd-style comments-in-alt-text here, which means: hover your mouse over the picture to read the comment.
I've sent it to a bunch of festivals since, but it hasn't attracted any love. Still, it was an interesting experience to go through all the stages of producing a short film and actually finish something (which I haven't always been expert at in the past).
That's a dead link right now, but I'll upload some pictures soon.
Happy 2018!
34C3
- Ecstasy 10x yellow Twitter 120mg Mdma (Shipped from Germany for 0.1412554 Bitcoins)
!Mediengruppe Bitnik Highly recommended if you don't know their work yet, otherwise just what you expected
- Die göttliche Informatik Rainer Rehak
- Pointing Fingers at 'The Media' (The Bundestagswahl 2017 and Rise of the AfD)
- Social Cooling - big data’s unintended side effect
Tijmen Schep
- Robot Music (The Robots Play Our Music and What Do We Do?)
Jacob Remin & goto80 Don't expect any art-conceptual insights though
Performance, 2017
In this performance I put myself in the exhibition space for the opening wearing a custom-built eye tracker which showed in real time where my gaze was directed, observable on screens attached to my augmented body.
"The pose must be understood much more generally as a photographic imprint on the body, of which the subject is not necessarily conscious: it can be the result of an image that has been projected onto the body so often that the subject begins to identify with it both psychically and physically. Incidentally, this image is by no means always flattering or invested with pleasure."
2 modified webcams and IR light source (attached to a custom-built head mount)
power outlet from wall attached to body (25 meter cord)
runs a modified version of Pupil
The technological developments of the early twenty-first century added a new immediacy and impact to this. The gaze itself of course always had been immediate, observable by the parties involved. But it was limited to its physical environment. Audience measurement was rudimentary. Not so in the present day and especially the internet: Knowing who sees what when had grown beyond mass surveillance. Every user of social media gets near to real-time feedback on their performance.
Results vary, but the influence of the public telegaze has been strong enough that large shares of social media users have turned to (supposedly) more oblivious and private ways of exposing themselves, such as Snapchat (and the inclusion of its features into other mainstream products). Silverman's words on unconsciously posing for the camera seem to ring even truer today, considering just how often each individual is photographed (or photographs themselves). And on the other side, the emergence of terms such as social cooling proves that users now feel, acknowledge and fear that even looking at something is an act with strings attached.
The performance Blickregime (sehen und gesehen werden) brings the technological and the social/physical space together in a performance. This format creates an actual experience of gazing and being gazed upon, a fundamental advantage. Only by involuntarily forcing bystanders to take both sides can an understanding (and hence empathy) of the ubiquitous process, the social dynamic of the regime of the gaze, be gained. The performance initially has no claim to action: merely by existing it attempts to expose the existence of the regime. Echoing Minujín's work, the opening was chosen as the ideal moment of observation. The multiple layers of sehen und gesehen werden (see and be seen) unfold: gallery visitors see artworks, gallery visitors see gallery visitors, gallery visitors see the performance, the performer sees gallery visitors and artworks, and everybody can see exactly how the performance sees all this, inviting the visitors to extrapolate this visibility to their own gaze and that of their surroundings. A feedback system establishes itself. The gaze is felt, spatially, physically. And as the performer I am the first to notice the arising self-censorship. Moving your eyes loses its unconscious innocence. The sensation stays, even after leaving the apparatus. You'll never look at anything the same way again.
Nominated for HGB Studienpreis 2017
At this very moment, one of the servers performing this work retrieves images from 24 randomly selected webcams. The webcams are located in the 24 time zones of the earth, one per time zone. The database contains between 60 and 100 active webcams.
The results are presented as a triptych (responsively, so on small devices they are stacked vertically). For each image, 3 to 7 layers are superimposed. The actual composition changes with every call of the page and can be forced with the "reload" button.
Thus, every moment that is fixed in its temporal dimension encompasses an immeasurably large number of combinations of its visual archiving. Several forms of time are negotiated: local (time zones, refresh and transfer rates), technical (webcam aesthetics) and historical (the moment in general — but especially internet — development, where it is [still] possible to access a large number of single-frame based webcams worldwide).
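The selection described above can be sketched as follows; the webcam records and their fields ('url', 'utc_offset') are assumptions, since the actual database is not public:

```python
import random

def select_webcams(webcams):
    """Pick one random webcam for each of the 24 time zones (UTC-11 to UTC+12)."""
    selection = {}
    for offset in range(-11, 13):
        candidates = [webcam for webcam in webcams if webcam['utc_offset'] == offset]
        if candidates:
            selection[offset] = random.choice(candidates)
    return selection

def layer_count():
    # each panel of the triptych superimposes 3 to 7 layers
    return random.randint(3, 7)
```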
Exhibitions 2017
Site-specific VR-lecture, 2017
First lignite mine shaft, then conversion in the GDR, then dilapidation. What historical authenticity can such virtual experiences develop in the auratic context of a real system? Is the logic of representation limited by the medium? How do different representations — rough 3D scans, clean 3D models, a talk, poetry — as different kinds of prosthetic knowledge alter our acceptance of its history, as a heritage, and its existence as a "memory"? A polemic, drawing on Jorge Luis Borges' 1947 short story El Inmortal (The Immortal).
2D documentary video (10:18, German audio - Spanish subtitles available)
Credits
Camera and 3D scan boom: Julie Hart
Sound: Leon Naffin
3D model template: Bernd Mörsberger
3D scan software: BundleFusion
Thanks to: Leon Naffin, Ortrun Bargholz
Interactive installation with VR, 2017
With Virtual Reality this can be overcome, since vision is detached from the own body. Does this also detach and dissolve self-perception as known until today? Is this a first step towards trans-humanism?
It isn't. We recognize ourselves with even low-resolution, square-y, hollow representations of ourselves. But we don't associate with the body, can't move naturally, are awkwardly lost in overlapping physical and virtual spaces. The movement of the own body, the movement of others entering the light cube (a physical take on the chaperone usually used in VR to remind users of the physical limits) and the movement of the virtual camera are quite overwhelming.
Kowalski, M.; Naruniec, J.; Daniluk, M.: "LiveScan3D: A Fast and Inexpensive 3D Data Acquisition System for Multiple Kinect v2 Sensors". in 3D Vision (3DV), 2015 International Conference on, Lyon, France, 2015
The Artist's Guide to BundleFusion with a Kinect
Let's assume you happen to have the right hardware (see below) and you'd like to work with BundleFusion. It sure looks cool.
Except it's not as easy to get working as it maybe promised. Here is my somewhat-more-in-depth (as compared to the README) setup process:
Pre-requisites
Hardware
- Kinect360 with the slightly overpriced USB 2.0 adapter, or KinectOne (aka v2) with the overpriced USB 3.0 adapter (not)
- Reasonable PC with a reasonable Nvidia GPU (don't ask me what that means)
- MSFT Windows 10 :(
Drivers
- Nvidia drivers (figure this out yourself, for my VR things I needed the most up-to-date available)
- CUDA 7.0 (do a custom install to preserve your shiny new drivers)
- MSFT Visual Studio 2013 (you can get that at their website after registration for free)
- for Kinect360
- Kinect SDK 1.8
- for KinectOne
- Kinect SDK 2.0
- the most recent drivers (which you get by doing this one weird trick -- no pun intended)
Don't bother thinking about using a recent CUDA or Visual Studio version. Just don't.
Downloading and setting up the code
If you use git it might be more straightforward, but since I never properly set up my MSFT Windows install, I downloaded the ZIPs directly from GitHub.
- Grab a ZIP of the master of the repo. Unpack it to BundleFusion-master.
- Grab a ZIP of the mLib submodule at the correct revision. Confirm this by navigating to /external/. Unpack this and replace BundleFusion-master/external/mLib.
- Grab the mLib external libraries from this Dropbox link they provide. This seems to be a bunch of dependencies or builds -- but no idea, really. Unpack and place next to BundleFusion-master.
This should leave you with the following folder structure. Double-check this; a wrong layout will screw you up later.
BundleFusion-master/
    external/
        mLib/           # this is the submodule you replaced
            data/
            src/
            [...]
    FriedLiver/
        [...]
        FriedLiver.sln
    [...]
mLibExternal/           # you downloaded this from Dropbox
    include
    libsWindows
    [...]
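To double-check the layout before opening Visual Studio, a small script like this can help. It is purely my own convenience sketch (not part of BundleFusion); the `expected` paths follow the tree above.

```python
from pathlib import Path

def check_layout(root="."):
    """Return a list of expected paths that are missing under `root`.

    An empty list means the folder layout matches the tree described
    in the tutorial; anything else names what still needs unpacking.
    """
    root = Path(root)
    expected = [
        "BundleFusion-master/external/mLib",           # the replaced submodule
        "BundleFusion-master/FriedLiver/FriedLiver.sln",
        "mLibExternal/include",                        # from the Dropbox ZIP
        "mLibExternal/libsWindows",
    ]
    return [p for p in expected if not (root / p).exists()]
```

Run `check_layout()` from the folder containing BundleFusion-master and mLibExternal; an empty result means you are good to go.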
Adjusting the code for your needs
Open BundleFusion-master/FriedLiver/FriedLiver.sln with Visual Studio 2013 (make sure it's really the 2013 version: Help > About Microsoft Visual Studio). Open the file GlobalAppState.h and change lines 3 to 10 to the following to enable your Kinect(s):
#define KINECT
#define KINECT_ONE
//#define OPEN_NI
#define BINARY_DUMP_READER
//#define INTEL_SENSOR
//#define REAL_SENSE
//#define STRUCTURE_SENSOR
#define SENSOR_DATA_READER
Now switch to a production (Release) build. The Debug builds don't work for me due to a missing FreeImage.dll. Check that your code and build config look somewhat like this:
Build
Go to Build > Build Solution or press CTRL + SHIFT + B and hope for the best. If you get any errors, you're on your own now. Otherwise: Congratulations! You have now compiled BundleFusion, which is about 80% of the hassle of getting to an actual 3D scan.
Configure
Copy the two files zParametersBundlingDefault.txt and zParametersDefault.txt from BundleFusion-master/FriedLiver to BundleFusion-master/FriedLiver/x64/Release.
Open them with a reasonable editor (e.g. Notepad++, not the regular Notepad, which doesn't understand the line endings).
As you can see, there are a lot of things to configure. Trial and error FTW. Run it after every change to see if it helped.
First of all, in zParametersDefault.txt, specify in line 2 the sensor you're going to use. I only got the Kinect360 to work, so that means
s_sensorIdx = 0;
Another thing that was crucial for getting it to work for me was changing line 49 in zParametersDefault.txt
to
s_hashNumSDFBlocks = 100000; //smaller voxels require more space
because 100000 is suggested in a comment in the original file.
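Since the exact line numbers can shift between BundleFusion revisions, it may be safer to match the parameter name than to count lines. Here is a small helper of my own (not part of BundleFusion) that rewrites a `s_name = value;` assignment in the config text shown above:

```python
import re

def set_param(config_text, name, value):
    """Replace the value of a `name = value;` line in a zParameters*.txt file.

    Matches the `s_sensorIdx = 0;` style used by BundleFusion's config
    files; any trailing comment on the line is preserved.
    """
    pattern = re.compile(r"^(\s*%s\s*=\s*)[^;]+;" % re.escape(name), re.M)
    return pattern.sub(r"\g<1>%s;" % value, config_text)
```

Read the file, apply `set_param(text, "s_sensorIdx", 0)` and `set_param(text, "s_hashNumSDFBlocks", 100000)`, and write it back.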
Try to run it
Start BundleFusion-master/FriedLiver/x64/Release/FriedLiver.exe. If that crashes with
bundlefusion-master\friedliver\source\depthsensing\VoxelUtilHashSDF.h(135) : cudaSafeCall() Runtime API error 2: out of memory.
you'll want to go back to the configuration and randomly change some numbers.
Shortcuts for interaction
Once you've got the little preview window up and running, you'll quickly wonder how the hell you're supposed to interact with this software: Where's the colored view? And how do you export anything at all?
This bug hints at reading the source code (an obvious place to look for shortcuts), specifically Source/DepthSensing/DepthSensing.cpp, where starting at line 205 we find this:
g_pTxtHelper->DrawTextLine(L"Controls ");
g_pTxtHelper->DrawTextLine(L" \tF1:\t Hide help");
g_pTxtHelper->DrawTextLine(L" \tF2:\t Screenshot");
g_pTxtHelper->DrawTextLine(L" \t'R':\t Reset scan");
g_pTxtHelper->DrawTextLine(L" \t'9':\t Extract geometry (Marching Cubes)");
g_pTxtHelper->DrawTextLine(L" \t'8':\t Save recorded input data to sensor file (if enabled)");
g_pTxtHelper->DrawTextLine(L" \t'7':\t Stop scanning");
g_pTxtHelper->DrawTextLine(L" \t'6':\t Print Timings");
g_pTxtHelper->DrawTextLine(L" \t'<tab>':\t Switch to free-view mode");
g_pTxtHelper->DrawTextLine(L" \t");
g_pTxtHelper->DrawTextLine(L" \t'1':\t Visualize reconstruction (default)");
g_pTxtHelper->DrawTextLine(L" \t'2':\t Visualize input depth");
g_pTxtHelper->DrawTextLine(L" \t'3':\t Visualize input color");
g_pTxtHelper->DrawTextLine(L" \t'4':\t Visualize input normals");
g_pTxtHelper->DrawTextLine(L" \t'5':\t Visualize phong shaded");
g_pTxtHelper->DrawTextLine(L" \t'H':\t GPU hash statistics");
g_pTxtHelper->DrawTextLine(L" \t'T':\t Print detailed timings");
g_pTxtHelper->DrawTextLine(L" \t'M':\t Debug hash");
g_pTxtHelper->DrawTextLine(L" \t'N':\t Save hash to file");
g_pTxtHelper->DrawTextLine(L" \t'N':\t Load hash from file");
The most important keys are 9 (export) and 2 (color).
Look at the results
The exported .ply files are stored in BundleFusion-master\FriedLiver\x64\Release\scans and named scan.ply, scan1.ply, et cetera. Work out a system for yourself.
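One possible system, sketched as a small housekeeping helper of my own (not part of BundleFusion): move each batch of exports to timestamped, labeled names right after a scanning session, so the next session's scan.ply doesn't overwrite anything.

```python
import shutil
import time
from pathlib import Path

def archive_scans(scan_dir, label):
    """Rename freshly exported scan*.ply files to timestamped names.

    FriedLiver numbers its exports scan.ply, scan1.ply, ... per session,
    so old files are easy to overwrite or mix up. `label` is whatever
    you want to call the session (e.g. the scanned object).
    """
    scan_dir = Path(scan_dir)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    moved = []
    for ply in sorted(scan_dir.glob("scan*.ply")):
        target = scan_dir / f"{stamp}-{label}-{ply.name}"
        shutil.move(str(ply), str(target))
        moved.append(target.name)
    return moved
```

For example, `archive_scans(r"BundleFusion-master\FriedLiver\x64\Release\scans", "studio")` after each session keeps the exports sorted chronologically.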
Post-process
Have a look at this two-part tutorial by Intel to get your model simplified, textured and (optionally) into Unity.
To be continued
Okay, I lied. Things aren't working super well and most likely you couldn't follow the instructions. But I hope you're a little further along now.
User-generated crypto performance, 2014, 2016, 2017
Awarded with the Deutscher Multimediapreis mb21 for the 2017 annual theme Big Dada
(work in progress)
Quora questions
It's not always about answers. Art is all about questions.
What are some industries where Machine Learning could be implemented but hasn't been yet?
Is it possible to get involved in machine learning and neural networks without a background in neuroscience?
What do we know about real neural networks that hasn't yet been applied in Machine Learning?
How "neural" are neural networks?
How can a neural network learn itself?
How do I learn neural networks?
Ich dachte es muss "Krieg oder Frieden" heißen
Max Ratay, Fabian Lehmann, Clemens Schöll
[ASCII art banner: "THE POPULIST CLOUD"]
The revolution's technology needs infrastructure. But all current infrastructure represents power structures opposed to our goals.
What is more, all previous attempts to reappropriate this infrastructure – and it has to be stressed that it is crucial infrastructure – have failed. A proposal targeted at the code-literate, tech-savvy, ideology-driven leftist user can't possibly replace an all-encompassing structure; it will remain a utopia drifting ever further from the mainstream capitalist reality in terms of performance. We can't run the revolution on underperforming infrastructure.
A powerful, people-owned, distributed infrastructure is needed. An infrastructure based on consensus, transparency, open source and open governance, that is: anarchist planning theory. And when governance is taken from us in open source areas we have to fork and starve the tyrants of our ideas.
We need an infrastructure that is inclusive, open to the public, open to the not technically savvy – we need A POPULIST CLOUD.
Near-Fi short film, 2017
13:06, color (HD), stereo sound (PT/DE)
Written and shot in Portugal and Germany in 2015 / 2016 with a broken Nikon P5100 and a regular D7100
A young German's video camera breaks. He sees it as a sign of the disintegrating media scene and the loss of trust in the news. After the right wing gains power in the 2017 elections, he flees to Portugal but remains unsure whether this was an adequate reaction. As he looks back, he asks himself if anything happened at all.
Shot winter 2015/2016. Curated winter 2016/2017.
002-KAMERA.wav
003-ORAKEL.wav
004-LÜGENPRESSE.wav
005-MEDIENREALITÄT.wav
006-DESCARTES.wav
007-UNVERÄNDERT.wav
008-WAHLEN.wav
009-RECHTE.wav
010-HASSREDEN.wav
011-WEGSCHAUEN.wav
012-FLUCHT.wav
101-SAUDADE.wav
101-SAUDADE-2.wav
102-GERAÇÃO.wav
102-GERAÇÃO-2.wav
102-103-GERAÇÃO-LUTAS-POLITICAS.wav
103-LUTAS-POLITICAS.wav
103-LUTAS-POLITICAS-2.wav
103-LUTAS-POLITICAS-3.wav
103-LUTAS-POLITICAS-4.wav
104-ISOLAÇÃO.wav
104-ISOLAÇÃO-2.wav
105-DEMOCRATICAMENTE.wav
105-DEMOCRATICAMENTE-A.wav
105-DEMOCRATICAMENTE-B.wav
106-DESPREZADO.wav
106-DESPREZADO-A.wav
106-DESPREZADO-B.wav
107-CULPADO.wav
107-CULPADO-2.wav
108-ELA.wav
108-ELA-2.wav
109-PERIGO.wav
109-PERIGO-A.wav
109-PERIGO-B.wav
109-PERIGO-C.wav
110-OHNMÄCHTIG.wav
110-OHNMÄCHTIG-2.wav
111-INACÇÃO.wav
111-112-INACÇÃO-PARALISADO.wav
112-PARALISADO.wav
112-PARALISADO-2.wav
113-ACUSAÇÕES.wav
113-ACUSAÇÕES-2.wav
200-MAR.wav
200-MAR-A.wav
200-MAR-B.wav
200-MAR-2.wav
200-MAR-3.wav
200-MAR-4.wav
PureData (pd-l2ork) and two small MIDI controllers
GUI froze about 15 minutes in, audio analysis crashed another 15 minutes later.
Visuals for Hugo Capablanca
04177 Leipzig
Bike frame, wood, leather
Created circa 2011, destroyed circa 2016
(2016)
A sudden gain of perspective.
Issued by: sitting on the sidewalk, staring into the wet streets, the yellow light reflected on the darkness of the night, on your birthday. 00:49AM, all alone, between one group of friends and the other, ultimately alone, as always.
IMMUNE SYSTEM
As a kid, I used to lick the handrails leaving the subway stations because I assumed it strengthened my immune system.
As a teenager, I had only shitty boyfriends, because I heard I'd eventually build up character.
As an adult – so far – I've been living in awful places, pursuing degrading jobs, because I hoped it would train my aesthetic and prepare me for pleasure, even happiness maybe.
At least, I tell you so now, to excuse the lousy choices I've made all my life; including you. And to explain why I am such an asshole now,
I guess I just over-adapted to this world.
AND ADMIRE HER
There is a groundbreaking sadness in you.
It slowly walks out of the back door of your thoughts and assaults your projections of happiness, that threatens me every day we wake up. I just can’t look into the assumed perfectness of your life without crying, I am attracted to you because I admire it - but I can’t actually deal with it, or with you.
But when you saw him die and the sadness broke out of you, I finally became a person in our relationship. No, actually, that was the relationship’s beginning. Everything before that was me drooling over what you tossed at me, because it was your leftovers.
Your emotional leftovers mainly.
I can’t deal with your sadness. It breaks me.
I wander through the yellow-lit, black-shadow, gray-water streets to the place I used to live. I still pay rent for it, but I never make it there; every day I want to wake up there, but in the evenings my miserableness pushes me towards you, trying to feel something in the fire I make of your emotional leftovers.
Now that you are depressed, I can’t burn anything. I step into a pond, a mixture of piss, rain and drugs, right in front of my door. I’d been gone for so long, I had forgotten about how terrible the area was, the house, the people in there, my minimalist room, that was supposed to air confidence and conscious living, but the mattress on the broken tiles in front of the dirty wall with nothing on it but dirt and dead spiders just emphasized my mediocre self-pity. There wasn’t anything heroic about being unable to fashion a room.
I turn the key, which to my surprise I found in my pocket and which happens to still work, and stand in the hallway. A guy I’d been living with for 8 months comes out of the shower and talks to me for the first time: “Man, could you move out, like, no bad vibes, but some friends are coming into town and they don’t have a place and we want to record together, so you know.” “Yeah, I’m not much of a musician,” I reply and walk by him. He probably stared after me, but I manage to reach my room without talking to anybody else.
Light floods in through the door. I hadn’t noticed the sun rise. When had it come back to life?
I hadn’t noticed that the sun ever shines into my window; apparently I had never been at home, awake and conscious, just after sunrise. Six years of lamenting about living in a dark alley and the sun just pours into my room on a January morning.
A strong urge to call her passes me. “I just wanted to let you know that we won’t see each other again. I found my old life and it’s okay. It’s actually shit but still better than with you. I hope we can stay friends.”
I put my 3 shirts, the spare trousers and the little underwear into the half-molded suitcase which I brought with me when I first entered the city; put the mattress over my head and walked out.
The guy was still standing in the same spot with the towel around his waist. “Can you fake my signature for moving out? There’s half a tomato left in the fridge, you can eat that.” And I walked past him. “Is she beautiful?” he asks – how the hell did he know what I was doing? “No, but the world is ugly.” “Thanks,” he said, and I left.
Back on the streets crowds poured in and out of metro-stations, I walked the known path back to her apartment. She stood in the kitchen, making coffee, wearing nothing but a veil.
“I was sad and I have learned enough from it. The book I started writing after you left will be published in June.” I put the mattress in the middle of the kitchen, sat cross-legged and admired her.
NATURE / EPIPHANIES
Literally every time I spend a couple of days in nature I have an epiphany about how happy I could be living in the countryside. Usually I get back to the city and its speed before my insane fantasies of idyll can trick me into trying it out and seeing how horrible it would be in reality.
THE PROMISE OF THE FUTURE
I just found myself looking forward to the day when I take a tour through the NSA facilities, guided by a survivor witness.
WHAT I ACTUALLY WANTED
“And then, when I get home and can just undress and lay on my bed, still completely high, and maybe masturbate – you know, that's like actually the best part of the party. Like, not always, but quite often I find myself lying there saying to myself “Yeah, this is what I actually wanted”, you know?” “So yeah, great, you have found a way to present solitariness as a pleasure, that's quite advanced when everybody is optimizing themselves in fearing to miss out. But then, why did you even come here?”
Why was I talking to this guy again? He seemed to also have forgotten his reasons to talk to me, so he just bent over and snorted whatever he had been arranging during our four-hour-long conversation while I took my coat to leave. I wanted to say something like “Because I need to feel shitty at the party. Because I have talked to people like you.” but even he would figure that it was a lie. After all, yes, I felt great at home because I had felt like shit at the party before. But not because it was a bad party – my carefully created artistic network of friends wouldn't –; but I never managed to cover the bitterness of my social anxiety, the over-thought conversations with unconscious guys I kept crushing on and the general disappointment of never getting “the feeling” at its maximum. To cover the bitterness of that cocktail with drugs.
It was raining outside as I walked home. My shoes soaked through. I was slowly calming down. What a nice night.
DRUGS
All the people I admire do more drugs than me. I sometimes feel ashamed for consuming fewer illicit substances than is socially acceptable – like, how am I ever supposed to make a career with these habits?
LOOKING ABSENT / HAPPY SMILEY
A girl looking absent within a group of friends in a club. I'd like to hit on her, but she gets out her phone and sends someone a happy smiley.
EGOCENTRICITY
This article is called “10 signs you ...” but I only talk about myself.
HOPE
I know I should be optimistic about the future, as my ancestors and maybe my parents were. When I walk out of the door, I should be amazed by the technology, astonished by the wealth, my breath taken by the beauty of everything.
But I always end up reading about illegal mines and child labor on my phone; sometimes I check it for blood stains.
Maybe the world will be a better place once all crime is organized by robots, committed against robots, enslaved by robots; and we remain as the phony country that doesn't speak up in foreign politics because of economic interests.
2015
Digital photography (no post-production)
2015
Audio: Julius Windisch
Original description (German, translated):
This is the first iteration, so in the second the sound will be the same, with a different image.
Skizzen ohne Hand
Landespreis für Literatur und Sprache Baden-Württemberg 2012
The text is set in a fictional world in which there are no disabilities, in particular with regard to absent or differently formed limbs.
It makes no reference whatsoever to disabilities in our world.
Ich gebe mich frierend, entschuldige meine Hände in der Tasche, die sie früher kratzte und erkläre ihm nichts Sinnvolles. Aber der Höfliche erübrigt das in Höflichkeit, wie höflich von ihm. Er hat elegante Hände, wie es die Höflichen pflegen sie zu pflegen. Ein Ring, keine Ehe3
Da steht ein Lindenbaum: 4
Wilhelm Müller: Am Brunnen vor dem Tore, 1822
[Zeilen 1 – 2, 3 – 4, 7 – 8, 9 – 10] (Im Folgenden nicht einzeln gekennzeichnet)
"Ihnen" - Persönliches Fürwort, Dativ, 2. Person in Höflichkeitsform
So manchen süßen Traum.
Vorbei in tiefer Nacht,
Die Augen zugemacht.
Später fragte mich der Wirt, um etwas sagen zu müssen8
Salvador Dalí: Traum, verursacht durch den Flug einer Biene um einen Granatapfel, eine Sekunde vor dem Aufwachen (1944)
Travel Log
#leipzig-hgb-erster-schultag
Travel Log
#trans-euro-bici-2016
Offenburg
Offenburg
Offenburg — Basel — Lisboa
Lisboa
Lisboa
Lisboa — Montijo — Montemor-o-novo
Montemor-o-novo — Portalegre
Portalegre — Valencia de Alcántara
Valencia de Alcántara — Cáceres
Cáceres — A-5 close to Jaraicejo
A-5 close to Jaraicejo — Navalmoral de la Mata — Pozuelo de Alarcón
Pozuelo de Alarcón — Madrid
Madrid
Madrid
Madrid — Barcelona — ███████
███████
███████
███████ — Barcelona
Barcelona
Barcelona
Barcelona
Barcelona — Vic
Vic — Fornells de la Selva
Fornells de la Selva — Girona — Llançà
Llançà — Massis de l'Albera
Massis de l'Albera — Port Leucate
Port Leucate — Vias
Vias — Castries
Castries — Uzès
Uzès — Donzère — Valence
Valence — Lyon — Drumettaz-Clarafond
Drumettaz-Clarafond — Chêne-Bourg
Chêne-Bourg — Yverdon-les-Bains
Yverdon-les-Bains — Bern
Bern — Ötlingen
Ötlingen — Offenburg
Offenburg