A fancy world of magic, unicorns and a raging lazy dev

Rant - Let's talk about journalctl ...
by X39


I am currently running a dual-boot system with Windows and NixOS ... The NixOS part, however, keeps getting a kernel panic every other minute, and diagnosing it has so far been hell at best.

Actual Rant

Logs are nice. They are the effective go-to choice for pretty much any tool that exhibits behavior which might be complex (what-happened logs) or error prone (what's-the-cause logs), or that simply needs to tell you who connected and who did not.

With this settled, let's ask: who are logs for? Well, logs are for you, the stupid human, who does not understand what your PC, your tool or whatever is doing. All fine and gold still here.

But what human scum, what 20/10 idiot, what douchebag decided to make logs a binary format, and that only, forcing literally everybody to use that frking "format" instead of the one everybody settled on ages ago: TEXT

No, let's make everything complicated to read by telling everyone how "easy" and "nice" it now is ...

1. Making something binary is not making it simpler

That is simply a fact. If I presented you with a Huffman-encoded version of Moby Dick, plus a fancy command line tool to read it with, you would be pretty disappointed, right? But if I gave you a txt file with literally THE SAME contents, one you can open on ANY OS using ANY FUCKING TEXT EDITOR, you would be quite fine, right?

So why then even CHANGE the fucking thing in the first place?

To get extra info into it? Use XML. Human-readable and, this will probably blow your mind, you can add extra info to every entry.

Was the idea to preserve error codes? News, everyone: { "[ERROR]", "[INFO]", "[IDIOT]" } is something one can write in any text file. Bonus points: if well-formed, this is even a valid way to add extra info, so XML does not even need to be chosen either.

To just be different? Well done, that is one cookie you get. You are different, to the point of annoyance.

2. Using the one tool to rule them all is a horrible decision

Let's assume I only have a broken Linux installation at hand, and the default live CD one gets from the interwebs is also borked to hell and crashes immediately. Now you want to know what component causes the crash. You only have your handy-dandy Android phone at hand, so you export the whole error log to it. Now what will you see? GARBAGE is what you will see. An utterly useless binarized format, just for the sake of having one.

3. The solution to keeping the tool

I know the following will be hard to swallow for anybody ... but parsing text files is trivial. It takes longer than reading binary formats, yes, but you are not reading the logs for the lulz on a regular basis. You read them because something happened and you want to diagnose it. So let's say we want to keep the tool that is journalctl, but without the idiotic binary file it requires. How would we go about this?

Simple: instead of reading the fucking binary format, READ A TEXT FORMAT. It is just as trivial, and it has the added benefit that one can read it outside of that ugly command line crap. It is readable on ANY OS with ANY text editor, and so on. And IF one still wants to filter stuff out, that can be done the same way as before. Sounds crazy, I know. But guess what: IT WORKS.


So ... how to proceed from here? Chances are you are at the same point I am: being angry at the human trash who came up with the idea of binarizing that shit (before somebody reaches out to me about what a good person he/she/the group is and what intentions they had: I did not read into it, and I am coming 100% from an angry user's experience of diagnosing the failure of his fucking system).

The only solution is to swallow the pill and somehow get a full export into a text file, which will be horribly formatted thanks to the "optimized output". Oh, and before you even attempt it: journalctl > my_readable_log.txt will not work, because the same idiots probably thought "why make it easy to export to txt? It should be hard to read the log in a user-pleasing way". And don't trust those idiots on the web saying "oh, do it anyway, it should work" ... it does not, believe me. I literally used journalctl -b -1 > ~/journal.log and, surprise surprise, I got an empty log file with the fancy size of 0 bytes.

So good luck diagnosing your system with journalctl. You might even be able to run regex, in your head, to solve all your problems at a glance. I personally will just wipe the Linux partition and avoid any Linux running systemd like it offered me a free buttfuck with a horsecock.

Proves once again why I still stick to Windows ... gaming works perfectly fine, and literally everything else too. And once I have a problem, I just wipe the whole system and am done with it. Linux nowadays? Have a kernel panic, and you don't even get logs, because kernel panic reasons are no longer displayed by default. (I can still remember those fancy days when I got that sweet message of "oh yeah ... the USB driver for XY crashed unexpectedly", thanks to some broken device I had attached, allowing me to, and stick with me here again, DIAGNOSE the reason easily. It was a blessing, unlike that mess on Windows where I had to google for the reason and dig through system logs.) After enabling them, you cannot find them (thanks to some weird configuration quirks that are not set up correctly, as just enabling coredumps is no longer enough). Reading through the logs to find the cause is not possible either (hey, Windows bluescreens are nowadays easier to debug than Linux ones), and the internet is full of uninformative "oh, this app crashes" threads when you search for anything related to "kernel panic", "system crash" or whatever.

So, long story short: stay away from Linux. It is a horrible experience nowadays, unless you have a setup that already works ... and it is a shame that it degraded into the mess it is today.

C++ - Fixing "Nested Templates" linker errors
by X39

So today I had to fix something rather ugly: a linker error caused by a missing template specialization.

The problem?

template<typename T>
data to_data(T t);

class value
{
    data m_data;
    // ...
public:
    template<typename T>
    value(T d) : m_data(to_data<T>(d)) { }
};

This fancy snippet here. Looks harmless: two templates, one not implemented, one is. All fine and gold, right? Well ... only if you always provide the upper one somewhere. As it happens, my to_data<T> implementations are actually specializations, e.g.:

template<> data to_data<std::string>(std::string t) { ... }

Again, all fine up until the point where you have a huge code base and actually use something bad ... and I ... missed something ...

// LNK ERROR because the linker tried to resolve the following
template<> data to_data<char*>(char* t);

Now, normally you could just throw in the implementation and call it a day. But with char*, things are different ... it is a dangerous one, because it might hide a potential memory leak. So how do you find the caller causing this ugly thing?

Well, rather simple! You add a declaration-only specialization (or overload) for the "calling" template, moving outward until you reach a non-template method, BUT you do not provide code for it, forcing a linker error that points at the actual call site.

For me, the following change did it:

class value
{
    data m_data;
    // ...
public:
    template<typename T>
    value(T d) : m_data(to_data<T>(d)) { }

    // Add some PEPPER to fix the linker error:
    // declared, but never defined, so the linker
    // now complains at the actual char* call site
    value(char* d);
};

Hitting build, this showed me that the actual cause of the linker issue was callextension_string_string, which did indeed pass a raw char* into this fancy constructor. But luckily, no memory leak.


  1. Add declaration-only specializations (or overloads) without code until the linker error points you to the place that explains it
  2. No more complaints

C++ - Working around corners
by X39

So as some of you might know, as of now it is not possible to create conversion operators that are not part of the corresponding class.

This causes ugly situations where one has some class and, e.g., std::string, both not editable by yourself for whatever reason, and needs to add a conversion, simply because value(std::string) is cumbersome and will not work for all cases (an easy transformation using <algorithm>, for example).

For all of those, I have a very basic solution here:

    #include <memory>
    #include <string>
    #include <iostream>

    class data
    {
    public:
        virtual ~data() = default;
        virtual std::string to_string() const = 0;
    };

    class value
    {
        std::shared_ptr<data> m_data;
    public:
        value() : m_data() {}
        value(std::shared_ptr<data> data) : m_data(data) {}
        template<typename T>
        value(T t) : m_data() { to_data(m_data, t); }
        std::shared_ptr<data> get() { return m_data; }
    };

    class d_string : public data
    {
        std::string m_value;
    public:
        d_string(std::string s) : m_value(s) {}
        virtual std::string to_string() const override { return m_value; }
    };

    inline std::shared_ptr<data>& to_data(std::shared_ptr<data>& input, std::string str)
    {
        input = std::make_shared<d_string>(str);
        return input;
    }

    int main(int argc, char** argv)
    {
        value val;
        val = std::string("test");

        std::cout << val.get()->to_string() << std::endl;
    }

This fancy "hack" will output test as expected and work on all major compilers :)

It works because, whilst adding conversion operators from outside the class might not be possible, overloading a free function (the to_data function we abuse here) is! The template constructor forwards to to_data, which is found via argument-dependent lookup through the std::shared_ptr<data> argument, so new conversions can be bolted on from the outside by simply adding more to_data overloads.

Let's talk VR (again)
by X39

So ... I got myself an Index.

It was 1079€, and in the end that would have been the no-go for me (paying a grand for a VR headset? No, thank you!). However, thanks to regular, wasteful spending on DotA 2 and Counter-Strike: Global Offensive cosmetics, I was able to get the price down to 607€. Still expensive, but kinda okay.

I have played with it for roughly two weeks now and ... it is awesome. Surprise, surprise! I KNOW ... now to the part that connects to the previous post I made: what do my friends think about it?

VR is great!

I usually showed them two games: Pavlov VR and Beat Saber, the two I am playing the most right now. The result was always the same:

  • ultimate joy
  • happy faces
  • FUN

My sister, being with the police, had way more fun in Pavlov VR than I expected at first (and she hit pretty much as well as I did with the rifles, but god damned ... pistols? Literally every shot a hit, at the spot she wanted, without even thinking for a moment). A friend of mine played Beat Saber for half an hour and got so exhausted that he had to sit down afterwards just to catch some air again. Another friend wanted to play Skyrim, just for us to find out that my modding efforts had made the game unplayable (after two days I finally fixed it ... the save file, btw., is still broken, but at least it starts again). But in the end, all of them asked the same question: how much?

How much is the fish?

Telling all of them the price made the happy laughter on their faces go dark in the blink of an eye ... and I only told them what the full kit I bought would cost, not what the beefy PC adds on top. I also briefly mentioned the Oculus Quest, the all-in-one inside-out-tracking headset, which currently sells for 450€ ... still too much.

And this is essentially why: VR is too expensive for what it has to offer. My usual VR sessions are ... what? Half an hour per day? And I dropped a fortune on a headset that most of the time will gather dust and nothing more. Obviously, I also get other uses out of the headset (I have yet to look into the development side of things), so ... I kinda could justify the price tag. But normal CONSUMERS, as they are? 450€ is a lot of money ... fucking hell, one of my friends got his first car for that price!


The average VR Joe now might say "what gives!?", but to you I want to give these lil hints:

  • Unless you play only single-player games, you will need other players.
  • Indie titles, let's face it, are shit. They do provide fun ... but most of the time they lack a lot of polish.
  • VR will not settle on one control scheme unless it is mandatory for game developers (small studio asks big company? Who cares?).
  • Enthusiasts do not pay rent.

Long story short: VR is too expensive and needs to drop below 200€ in total before most of those I showed the headset to would even consider it. SDE does not matter here, btw., but the cable was a concern for most of them. Setup is also a concern, so for mass-market appeal, inside-out tracking it will be.

Why VR, in its current form, will never be the "huge hit"
by X39

There is one fairly simple reason: it is way too expensive. Especially in the EU.

Just as an example: the HTC Vive Pro starter kit (not even the full kit) sits at a whopping 1199€ (1359.28 USD) in the EU and "just" 968.53€ (1098 USD) in the US. Both prices are way too high, preventing the tech from actually settling with the non-fanboy user. And I do mean fanboy here! Most people on the internet "defending" the high price simply say "just save up some money and you will be fine." Thing is: at these prices, people buy cars. There is simply no way the average consumer, only slowly becoming able to afford a computer beefy enough to even run VR (especially with the all-time high of graphics card prices), will throw out the money for a top-tier graphics card (at the time of writing, the RTX 2080 Ti sits at 1100€ in the EU) for "just" some piece of tech he will barely ever use.

And here is the problem: VR is "just" an input device bundled with a special gimmick, virtual reality. Now, people defending VR may say "it is not meant for everyone", but this is the thing they fail to realize: if VR is not for everyone, VR will fail. The entry barrier (the beefy PC) is already high enough that most people out there simply cannot overcome it. But even if the entry barrier is conquered, there is now the problem of the absurd price point.

Now, smart users may point out that there is an alternative, the non-Pro HTC Vive. And you are correct, kind of ... See, first-generation VR can hardly be sold to anybody who wants to play games casually. It is not comfortable (like most Gen1 VR headsets), and it has a horrible resolution, causing SDE and making reading text with it kind of a nightmare.

All this makes those headsets suited for enthusiasts, but not for the average John Doe who just wants to enjoy playing some games in VR.

This is where Gen2 VR failed miserably: the price should have gone DOWN and the SDE should have been mostly eliminated. Thing is, the price doubled. That is bad. It is literally as simple as that.

Want VR to succeed? Put the headsets into a more "sweet spot" price range, one that does not require people to save a decade's worth of earnings for these things.

Want VR to fail? Then leave the price way too high, making the headset itself as expensive as the computer you just purchased to use it with.