Despite its retro appeal, running a Windows XP installation full of security holes isn't something I'm into, so I tried installing Windows 10. On this beast with 1GB of RAM and a "power efficient" 32-bit Atom CPU, Windows 10 surprisingly works, but of course struggles. There's just not enough RAM for all that junk running in the background. The quiet days of Windows XP are well behind us.
This isn't an uncommon experience. As software gets increasingly resource hungry, a ton of lower-end hardware is being left behind. These are typically work and school laptops bought in the hundreds or thousands by large institutions to run bloated Windows installations with multiple layers of security and monitoring software. It's not a great experience, especially for school children who might be put off desktop platforms entirely.
Over time, these machines often end up in landfill or sold as "refurbished" on the used market. Even after the bloat is removed, running a modern web browser on Windows 10 and above is a bit much for a machine with only 4GB of RAM let alone 1GB.
So what's the solution? Linux of course! While a lot of popular distributions are dropping 32-bit CPUs, for good reason, a few still support it. And when it comes to limited RAM, there are still plenty of options. I've tried antiX, MX Linux, Lubuntu, and some others.
In the end, I went with antiX. While it doesn't have as many features as MX Linux, it still supports most Debian packages and uses less than 300MB of RAM while being fast and responsive.
Since the Samsung NC10 is really old, some stuff doesn't work right. I'm not sure why, but antiX had a few issues while MX Linux worked fine out of the box. On top of that, antiX has a bunch of features that seem a bit useless and can be removed. So here are a few things to do.
As always, after installing a new operating system, it's important to install all available updates. This will also flag any problematic packages and configuration. For example, "rtl8821cu-dkms" fails to build on 32-bit devices. Since the NC10 doesn't use it, I removed it.
sudo apt remove rtl8821cu-dkms
Other hardware-specific packages that aren't used and fail to compile can be removed too.
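To spot such candidates, the output of dkms status can be filtered for modules that never reached the "installed" state. Since the real output is machine-specific, this sketch parses a captured example (the module names, versions and kernel strings are illustrative):

```shell
# List DKMS modules that aren't in the "installed" state.
# "status" is a captured example of `dkms status` output.
status='rtl8821cu-dkms/1.0: added
broadcom-sta/6.30, 6.1.0-18-686, i686: installed'
failed=$(printf '%s\n' "$status" | awk -F'/' '!/: installed$/ {print $1}')
echo "Not installed: $failed"
```

Anything it prints is worth checking: either the build genuinely failed, or the module simply isn't needed on this hardware and can go.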
To disable Conky (the desktop widget) properly, open ~/.desktop-session/desktop-session.conf, find LOAD_CONKY and set it to false:
LOAD_CONKY="false"
Disabling it through the menu tools is not permanent and uninstalling Conky won't remove antiX's attempts at starting it.
We can disable taskbar widgets from the "IceWM Control Centre" or from "Settings > Preferences > TaskBar".
By default, when no network is connected at startup, the connman network window is launched. This is because the connman system tray is disabled by default in favour of the network monitor widget.
If you expect to not always have a network connection, it's best to disable the pop-up window behaviour and enable the tray icon in ~/.desktop-session/startup.
Here's the best combination I've found using what's available by default:
Use a theme with a dark toolbar as some icons are light and aren't visible against a light toolbar.
Unfortunately, "Clearlooks" is the only IceWM theme with a matching widget theme. However, it uses a light toolbar.
Make sure to use an antiX labelled theme so that it has all the necessary icons. For example, do not use Adwaita.
Other than a solid colour or personal image, the "Space.jpg" default wallpaper goes well with a dark and blue theme.
Note that slimski (the login manager) doesn't support solid colours, so we'll have to create a plain image of a single colour and use that instead.
Sometimes the volume icon fails to appear in the taskbar on startup. In ~/.desktop-session/startup there is a sleep before volumeicon is run. If PipeWire isn't ready by then, the volume icon will fail to appear.
To bring back the volumeicon for the current session, run in a terminal:
volumeicon & disown
For a more permanent fix, we can increase the sleep period. However, that adds more delay. Instead, we can keep retrying until it works:
until volumeicon; do sleep 5; done &
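One caveat of that one-liner is that it retries forever if volumeicon keeps failing. A capped variant is safer; sketched here with a stand-in function (start_widget fails twice, then succeeds) so the logic can be seen end to end:

```shell
# Bounded retry: like the startup one-liner, but gives up eventually.
# "start_widget" stands in for volumeicon; here it fails twice, then succeeds.
tries=0
start_widget() {
  tries=$((tries + 1))
  [ "$tries" -ge 3 ]  # succeed on the third call
}

attempts=0
until start_widget; do
  attempts=$((attempts + 1))
  if [ "$attempts" -ge 12 ]; then
    break                 # cap at 12 retries (~1 minute with sleep 5)
  fi
  sleep 0                 # use "sleep 5" in the real startup file
done
echo "started after $tries calls"
```

In the real startup file, replace start_widget with volumeicon and the sleep 0 with sleep 5, and run the whole loop in the background with a trailing &.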
If the touchpad isn't behaving as expected, take a look at its current parameters with synclient -l. The definition of each parameter is under man synaptics.
In my case with the NC10, the left side was triggering vertical scroll. I first assumed this was because VertEdgeScroll was enabled. Even though it should only work on the right edge, the default edge values might have exposed the scroll region on the left side too. So I disabled VertEdgeScroll to increase the usable area for cursor movement, since VertTwoFingerScroll works fine.
synclient VertEdgeScroll=0
However, the left edge still scrolled vertically. So I reduced LeftEdge, which seems to have fixed it.
synclient LeftEdge=1000
To persist these changes between reboots, they can be placed in X.Org's global config directory, in user-specific config files, or run as startup commands. For example, here's what my /etc/X11/xorg.conf.d/synaptics-fixed.conf looks like:
Section "InputClass"
    Identifier "TouchpadFixed"
    MatchIsTouchpad "yes"
    Driver "synaptics"
    Option "VertEdgeScroll" "0"
    Option "LeftEdge" "1000"
EndSection
It's named intentionally: X.Org reads these files in lexical order, so "synaptics-fixed.conf" applies its parameters after the existing /etc/X11/xorg.conf.d/synaptics.conf.
~/.desktop-session contains most of the additional startup tasks and configuration, like network, audio and desktop settings.
Most lightweight Linux distributions ship with Firefox. However, Firefox seems to struggle on older hardware and doesn't get a chance to shine. Chromium-based browsers are generally lighter, both in CPU and RAM usage. Falkon seems to use the least resources, so I strongly recommend using it instead.
For numbers, here's RAM usage on Wikipedia's English homepage on the NC10:
I tried other browsers like "surf" and "GNOME Web (Epiphany)". They were all around Chromium's numbers. I didn't bother with less compatible, old-school browsers. In general, due to modern web standards, most "minimal" browsers are minimal on UI, not hardware resources.
Chromium-based browsers also used 10% to 20% less CPU for video playback compared to Firefox. Scrolling on Wikipedia was a lot smoother too.
Opening tabs and settings was slow on both Firefox and Chromium, while Falkon was quick and responsive. This is likely because Falkon only uses Chromium's core via Qt's interfaces.
Falkon comes with an ad-block extension and supports vertical tabs. It's missing some common features, like an HTTPS-only mode, but generally they're things that can be done without.
Falkon's new tab "Speed dial" page is oddly slow so I recommend always defaulting new tabs and windows to a blank page.
If the display has a low resolution, I suggest reducing the default page zoom to 80% to fit more content.
On older hardware, it's best to prefer websites that don't require JavaScript. For ones that do, look for alternatives. For example, DuckDuckGo has an HTML-only version and an even lighter version. YouTube has many alternative front-ends for video discovery, and native video players which can open YouTube links.
Historically, the battery on a lot of NC10s, including the one I have, failed within a few years of purchase. I strongly suspect Samsung botched the SMART controller boards on most of them as the cells seem to be fine. They charged, and the laptop ran for hours while on "0%". The controller just sends the wrong information. That's a huge amount of waste. Sadly, avoidable battery waste is still a common occurrence.
I looked into potential ways to repair it, but given its age and the risks involved, I dropped the idea. Buying a new battery was also out of the question; they're expensive and will probably fail soon anyway. On the bright side, the laptop's a lot lighter without a battery even if I need to carry the power adapter around.
At this point, I had a fully working Samsung NC10 running up-to-date software. To make it run a bit better, there are some hardware improvements we can make.
The NC10 supports a maximum of 2GB of DDR2 RAM, so I swapped in a 2GB DDR2 stick. It greatly improved multi-tasking capabilities and reduced the risk of thrashing.
The NC10 uses a standard SATA connection so I swapped out the HDD with a spare SATA SSD. As expected, things were noticeably faster. Though, that could just be placebo given how weak all of the other hardware is. The main thing is that it no longer makes constant noise even when idling. Of course, the CPU fan still kicks in now and then.
Finally, the NC10 uses Mini PCIe for its old WiFi card, so I swapped that out with a used WiFi 5 card. Like the SSD change, throughput is limited by everything else, but it's nice to be able to use an existing network without any external changes.
I now have a Samsung NC10 that's as good as it can be. As someone who often uses close to the latest technology, both in hardware and software, doing these sorts of low-end projects is a useful reminder to appreciate hardware that is easily upgradable, and software that is built to be efficient and compatible. The NC10 with its dodgy battery and Atom processor isn't the best example of this, but it's at least been fun and nostalgic.
Thanks for reading.
I'm guessing it's a somewhat niche and new market, so popular brands are really expensive. I decided to buy a cheaper one, a "FENVI AX5400 WiFi 6E USB Adapter" to be precise. Out of the bunch, it was the newest, looked nice and was apparently really fast. 2.4Gbps over WiFi 6 at 5GHz! I'll test those unlikely speeds in the future, right now I just want to make sure it works -- on Linux of course.
Before we start, remember that Linux installations vary wildly, so I can't guarantee a solution. You might even break something. So do this at your own risk!
Without the correct drivers and firmware, the device won't do much. It won't show up as a WiFi network device or anything. To confirm it's a driver issue rather than a hardware one, we can list the USB devices; the adapter should still be listed:
lsusb
The issue is that since the chipset is pretty new, there's a good chance the Linux kernel we're using doesn't have the necessary drivers baked in. To find out which chip the device is using, check the manual or product description. I had a "RTL8832CU".
It's possible to install additional drivers to the kernel using DKMS. Search the package manager for the chip's name and it might already be there. If it is, problem solved! In my case it wasn't, so I had to do some additional research.
Linux's community of developers works hard to provide the latest drivers for new hardware. While those drivers might not be easily available, it's possible to find them through forums and by looking into where alternative package managers get their sources from.
In my case, I found an AUR package by aquilarubra which uses a git repository by lwfinger for its sources and an additional dkms.conf file. A quick look through the repo and a make later, this looked like a robust solution.
Before starting, let's make sure our changes will have a visible effect. The list of DKMS modules should not initially include "rtl8852cu".
dkms status
First we'll clone the source repo.
git clone https://github.com/lwfinger/rtw8852cu
Then we can add a dkms.conf to its root directory, similar to the AUR package:
PACKAGE_NAME="rtl8852cu"
PACKAGE_VERSION="1.0.0"
MAKE="'make' -j$(nproc) KVER=${kernelver} KSRC=/lib/modules/${kernelver}/build"
CLEAN="make clean"
BUILT_MODULE_NAME[0]="8852cu"
DEST_MODULE_LOCATION[0]="/kernel/drivers/net/wireless"
AUTOINSTALL="yes"
Note, while the package is named "rtl8852cu" and the repo is named "rtw8852cu", it works with other chipsets in the same series. For consistency, we'll keep the same name as the AUR package.
sudo dkms add /home/main/repos/rtw8852cu
DKMS will make a copy of the repo with the appropriate name and set everything up. Changing the original repo will not change the repo added to DKMS, since it's a full copy, not a symbolic link or anything like that.
The module should now appear in the module list as "rtl8852cu":
dkms status
To build and install the module for the current kernel, run:
sudo dkms autoinstall
To make sure everything is set up and persisted correctly, restart the machine.
Now when we plug in the USB adapter, it should automatically show up as a WiFi network device!
If it doesn't work, you may need to look further into your Linux distribution's defaults. For example, you may need additional firmware packages installed such as "non-free" or "ahs" variants. If you can't find them, you may need to add the relevant package repositories.
It's obvious now that when hunting for these sorts of devices, it's best to check what's in them to ensure a level of quality and compatibility. Had I known "RTL" was referencing "Realtek", a company known for its poor drivers, I would've stayed well away. To help navigate this area, morrownr has created a handy guide.
As always, I'm really impressed with Linux's hardware support. Using the latest non-free hardware always comes with a risk of incompatibility. In this case, that risk paid off thanks to Linux's thriving community.
Thanks for reading.
The most obvious solution is to buy a USB video and audio capture device. They're very common and cheap, usually selling for around US$3.50. Sure they only take composite video or S-Video rather than RGB or component, but that's good enough.
One of the many things I dislike about Windows is that it installs random proprietary junk whenever I plug in a USB device. And if it doesn't, I'll need to install that junk manually. Going by user reviews, as usual the drivers seem to be buggy, incompatible and full of security issues. So instead, I'm going to use a Linux test machine. By using a test machine, I can minimise any potential damage from a random cheap USB device. Linux has video4linux2 (v4l2), which is a collection of open source drivers and other software for video capture devices.
To avoid any ambiguity, I'll start by sharing a VLC configuration without my personal preferences. This should give us an idea of the sorts of information we need to get this working.
Video device name: /dev/video2
Audio device name: hw:1,0
Video standard: Undefined (default)
Edit options: :alsa-samplerate=96000
Converting this to a command line gives:
vlc v4l2:///dev/video2 --input-slave=alsa://hw:1,0 --alsa-samplerate=96000
We should always first make sure the USB device is working. Otherwise we'll be wasting our time.
To do so, first list all USB devices with:
lsusb
We should see our device in the list. It may be listed with a different name or, in my case, even no name.
If we can't find it, list devices before and after plugging it in and compare.
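The comparison itself can be scripted. Here the listings are captured example strings, since lsusb output is machine-specific; the nameless "534d:0021" entry stands in for our device. On a real machine, redirect lsusb to the two files instead:

```shell
# Diff two USB device listings to find what was just plugged in.
# "before"/"after" are captured example listings; on a real machine,
# run `lsusb > /tmp/usb-before.txt`, plug the device in, then
# `lsusb > /tmp/usb-after.txt`.
before='Bus 001 Device 002: ID 8087:0024 Intel Corp. Hub'
after='Bus 001 Device 002: ID 8087:0024 Intel Corp. Hub
Bus 001 Device 005: ID 534d:0021'
printf '%s\n' "$before" | sort > /tmp/usb-before.txt
printf '%s\n' "$after"  | sort > /tmp/usb-after.txt
# comm -13 prints only the lines unique to the second file.
new_device=$(comm -13 /tmp/usb-before.txt /tmp/usb-after.txt)
echo "New: $new_device"
```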
We can also dump all USB device details for more information and find it that way.
lsusb -v
Once we have identified the device, we can list only its information using its device ID:
lsusb -v -d 534d:0021
This is a useful way to see what video and audio formats are supported along with frequencies, bitrates and resolution.
For reference, my USB device is a "MACROSILICON AV TO USB2.0 20150130".
We can see a list of video devices provided by the USB device using:
v4l2-ctl --list-devices
This might show other video devices too like for the built-in laptop webcam.
If v4l2-ctl is not available and can't be installed, we can list all available video devices:
ls -la /dev/video*
To find which devices are coming from our USB device, we can list them before and after plugging it in. For convenience, we can use the symlinks matching our USB device under /dev/v4l/by-id/ or /dev/v4l/by-path/.
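Those symlinks just point at whichever /dev/videoN node the device was assigned this boot, so resolving one gives the current path. A sketch using a throwaway symlink, since the real by-id names are device-specific:

```shell
# by-id/by-path entries are symlinks to the numbered device nodes;
# readlink -f resolves them. Demonstrated on a temporary fake symlink
# (the name "usb-EXAMPLE-video-index0" is made up for illustration).
mkdir -p /tmp/v4l-demo
touch /tmp/v4l-demo/video2
ln -sf video2 /tmp/v4l-demo/usb-EXAMPLE-video-index0
resolved=$(readlink -f /tmp/v4l-demo/usb-EXAMPLE-video-index0)
echo "$resolved"
```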
In either case, chances are our USB device will provide multiple video devices. Finding the correct one is trial and error. In my case, the correct one was /dev/video2.
Each time we plug in the USB device, the video device paths might change. So we'll need to repeat this process to re-identify them, or use the symlinks mentioned before.
Other than by checking the USB information, we can find a more basic list of supported formats with ffplay:
ffplay -f v4l2 -list_formats all /dev/video2
My device has raw video at 480x320 and compressed video using MJPEG up to 720x480.
To access these formats in VLC we can add more options:
:v4l2-chroma=mjpg :v4l2-width=720 :v4l2-height=480
It's worth noting that whenever I use these options, they seem to stick somewhere such that if I removed them, they'd still be active. So for example, I couldn't switch back from mjpg to raw video. Though sometimes, but not always, after a reboot or reconnect it would switch back. I didn't look into what's causing it. It could be VLC, v4l2 or even the USB device, though it's probably v4l2.
We can see if the audio device is detected by ALSA with:
arecord -l
or
cat /proc/asound/cards
Typically, if our device is second on the list, its identifier for ALSA-related commands and configurations will be hw:1,0 (card 1, device 0). If not, again, it's trial and error.
The actual sound device path can be found under /dev/snd. For convenience, we can use the symlinks matching our USB device under /dev/snd/by-id/ or /dev/snd/by-path/. However, VLC doesn't support supplying raw sound device paths to v4l2 due to security issues. So we won't be using these paths; we'll use the ALSA identifier instead.
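The card index for that hw:CARD,DEV identifier can be pulled straight out of the card listing. This sketch parses a captured /proc/asound/cards-style excerpt, since the real listing is machine-specific (the card names here are illustrative):

```shell
# Find the ALSA card index of the USB capture device and build its
# "hw:CARD,0" identifier. "cards" is a captured example listing;
# on a real machine, use the contents of /proc/asound/cards.
cards=' 0 [PCH            ]: HDA-Intel - HDA Intel PCH
 1 [Device         ]: USB-Audio - USB Audio Device'
index=$(printf '%s\n' "$cards" | awk '/USB-Audio/ {print $1; exit}')
echo "hw:${index},0"
```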
There are multiple applications for playing and recording the video and audio captured by v4l2 devices. The most common are:
VLC is best for getting started as it has a GUI. Though, that GUI is extremely limited and confusing, so the GUI-less cvlc is useful too.
ffmpeg is useful for troubleshooting VLC-specific issues and extracting more information.
mplayer's not that useful if we're using the other two.
Once everything is figured out, it's best to use ffmpeg for recording since it avoids the bloat of VLC, which probably uses ffmpeg underneath anyway.
At this point, we can provide VLC with the information we've gathered so far through its GUI. I recommend launching VLC from a terminal so that its logs are easily accessible.
Video device name: /dev/video2
Audio device name: hw:1,0
When we try running this, chances are there'll be some problems. We'll get into those next.
A black or blank video usually means there is no video data coming into the device. Check the cables and connectors.
Make sure the connectors are colour matched and that they are all the way in. Some connectors, especially as they age, can get a bit stiff.
If we're using a SCART output, make sure the SCART connector going into the output device has output pins. Some SCART connectors, typically adapters, only have input pins that are expected to only be plugged into a display.
A misaligned, warped, glitchy or static output without the correct colours likely means the source device is outputting a different video standard from what the capture device expects. For example, 60Hz NTSC instead of 50Hz PAL.
Change either the video standard in VLC or video settings on the device so that they both match.
v4l2 has the usual options to adjust the image, such as brightness and contrast. VLC also has those options under "Tools > Effects and Filters". VLC's options are a lot more convenient as it provides a live preview. Unfortunately, VLC's GUI uses tiny sliders to adjust values, with no numbers or number inputs visible.
I suggest using a test image to tweak the values, similar to calibrating a TV or any other display. Use the arrow keys to have proper control over the sliders.
In my case, the device's brightness was way too high. Dialing it down by around 10 steps and balancing out the other values greatly improved things.
For the USB device I had, I couldn't figure out how to get VLC or ffmpeg to support NTSC (480i@60Hz), so I always configured the output device to use PAL (576i@50Hz) instead, which works as the default. Changing VLC's "video standard" option in general didn't seem to have any effect. Trying to identify available standards also errored:
ffplay -f v4l2 -list_standards all /dev/video2
# /dev/video2: Immediate exit requested
I assume this is a driver limitation, as screenshots of the proprietary software do show options to change video standards. This wasn't a major issue for me as, being in the United Kingdom, most of my media and devices were targeting PAL anyway.
VLC also has a v4l2 tab under "Effects and Filters" but, other than a reset button, it was empty.
Chances are when using VLC with basic options, we'll run into an error with the audio device:
access_alsa demux error: cannot restrict rate to 192000 Hz or less: Invalid argument
main input error: Your input can't be opened
main input error: VLC is unable to open the URL 'alsa://hw:1,0'. Check the log for details.
This is because VLC's default sample rate (44100 Hz) doesn't match the device's. We can find our device's actual sample rate with:
lsusb -v | grep 'tSamFreq'
This should show something like:
tSamFreq[ 0] 96000
If there are multiple results, try filtering by the USB device as shown previously.
In VLC, we can provide the correct frequency by adding an additional option under "Edit options".
Since VLC uses :input-slave to set the audio device, we'll need to add:
:alsa-samplerate=96000
After all of these fixes, the end result should now be as best as it can be. We can't expect a perfect solution from such a cheap device. But thanks to Linux and open source, it's a reliable one.
Thanks for reading.
Here's a crude diagram showing such a case. Lines are the case's walls. Fs are front intake fans. Bs are back exhaust fans. A typical layout.
 __________
|B       |F|
|        |F|
|________|F|
The problem with these panels is that they prevent air from being pulled in by the front fans. Fans need clearance immediately in front of them. So what happens instead? Since there's not enough air coming in from the front, the air will instead come in from the inside of the case (the back) and create a vortex of stagnant hot air.
The same issue arises with mesh panels and dust filters. If the holes aren't large or numerous enough, or if they're continuously getting clogged, it will block air flow to a noticeable extent.
Cases might provide vents on the side to allow air in, but it's never enough. Fans aren't built to take air in from the sides.
Of course, knowing this doesn't help after the fact. Selling the case now just kicks the can down the road and it's a huge waste to dispose and replace the whole thing.
I've helped first-time buyers with this issue a few times since they use popular budget cases like the CiT Flash. The solution is always the same: move the fans to the inside of the case.
 __________
|B      F| |
|       F| |
|_______F|_|
Usually front fans are installed on the outside of the case. A fan is around 2cm or more thick. So by moving it to the inside using the same screw holes, we get those few centimetres as clearance before hitting the front panel. This gives the fans room to pull air in and create a vacuum of sorts in front of them. That should provide enough pressure to naturally pull more air in from those side vents.
If the fans are already inside the case, then add some grommets or spacers. The main thing is to introduce a few centimetres of clearance between the fans and the front panel.
Thanks for reading.
For example, take a graph showing used and free RAM as stacked series to represent total RAM. It would be useful to know the value of available and total RAM at any given point in time. We can't include those additional series on the same axis since they would stack with the others and not make sense. Putting them on the right axis would clutter the graph. They could be added as separate panels, but then they'd take up more dashboard space and we'd need to look for them every time we needed the numbers.
Ideally we'd be able to see those numbers only when we hovered over a point in time on the graph. It's space efficient and convenient.
Doing that is a bit more complicated than it looks and requires knowing about multiple Grafana features and working around some buggy behaviour.
To be able to assign additional series to a separate axis and hide them, we'll need to apply "Overrides" from the sidebar. This lets us add series-specific configuration. Since we're matching multiple series, we'll need to use the regex matcher; it's generally the most flexible option. For our regex, we can use a /(Total|Available)/ pattern, where "Total" and "Available" are the names of our series.
For our overrides, we'll need to move our series to their own axis so that they don't mess with the visible axis. Grafana provides three options under "Axis > Placement": "Left", "Right" and "Hidden". We want "Hidden" as we don't want the axis to be visible.
Note, the hidden option does not hide the series; it's still visible. Only the axis itself is hidden. It's a third axis, so the series is not using the right or left axis.
The next step is to hide the series. We can do this by setting the "Graph styles > Line width" override to "0". Depending on your panel configuration, you may also need to set "Fill opacity" to "0", "Show points" to "Never", among other things.
When we hover over the graph, we can still see a point representing the series' value at that point in time. This is confusing as the point is not aligned to the left or right axis. Ideally it should be hidden too.
Here's the not-so intuitive part of this process. To completely hide the series point when we hover over the graph, we need to override "Standard options > Max" to a number below any possible value of the hidden series, essentially putting the series out of bounds so that it's never visible. This is usually "-1" if your series is always positive. Otherwise, try "-9007199254740991" (the lowest safe integer). You can alternatively use "Min" instead if that works for your series.
You might be thinking that since the series is now out of bounds, we don't need the earlier line-hiding steps. However, skipping them triggers a bug in Grafana. Your graph will take 10 seconds to load, and any time you resize, configure or do literally anything with the graph, you'll need to wait another 10 seconds. During that time your web browser will freeze, at least on Firefox. I think it's because Grafana does a ton of calculations thinking it needs to render the series, and since it's out of bounds, those calculations go haywire. So yeah, don't skip any steps.
Now the series should be completely hidden, even if we hover over it.
For the final step, we'll hide the colour in the legend and tooltip so that people know that the series is hidden.
We can set "Standard options > Color scheme" to "Single color" and choose the bottom-left colour from the "Colors" tab on the colour picker, or set rgba(0,0,0,0) from the "Custom" tab. This makes the colour transparent.
And we're done! We now have a completely hidden series which is only visible in the tooltip when we hover over the graph.
To summarise, this is the configuration we need for the override matching /(Total|Available)/:

- Axis > Placement: Hidden
- Graph styles > Line width: 0
- Graph styles > Fill opacity: 0
- Standard options > Max: -1 if your series is always positive, otherwise use whatever's out of bounds
- Standard options > Color scheme: Single color, rgba(0,0,0,0)
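For reference, here's roughly how those overrides look in the panel's JSON model (viewable via the panel inspector). The property IDs below ("custom.axisPlacement", "max", etc.) are from recent Grafana time series panels and may differ across versions, so treat this as a sketch rather than something to paste verbatim:

```json
{
  "matcher": { "id": "byRegexp", "options": "/(Total|Available)/" },
  "properties": [
    { "id": "custom.axisPlacement", "value": "hidden" },
    { "id": "custom.lineWidth", "value": 0 },
    { "id": "custom.fillOpacity", "value": 0 },
    { "id": "custom.showPoints", "value": "never" },
    { "id": "max", "value": -1 },
    { "id": "color", "value": { "mode": "fixed", "fixedColor": "rgba(0,0,0,0)" } }
  ]
}
```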
Thanks for reading.
backdrop-filter on a component with visible boundaries does not naturally include elements outside its boundaries in its blur calculations. I had similar issues with FrontierNav, where using a regular filter blur on a background element will make the edges soft by including the background colour in its blur calculations.
In both cases, the solution is the same: overflow the blur and then hide it.
For filter, that's done on the background element itself:
.Parent {
  position: relative;
  overflow: hidden;
}

.Background {
  position: absolute;
  top: -20px;
  bottom: -20px;
  left: -20px;
  right: -20px;
  filter: blur(20px);
  /* ... the rest */
}
For backdrop-filter, that can be done with a pseudo-element:
.Element {
  position: relative;
  overflow: hidden;
}

.Element::before {
  content: "";
  position: absolute;
  top: -20px;
  bottom: -20px;
  left: -20px;
  right: -20px;
  backdrop-filter: blur(20px);
}
It would be nice if CSS had a way to avoid this hack. But for now, it's simple enough to work around.
Thanks for reading.
Previously, the sidebar in FrontierNav's mobile layout overlapped the current page. This made the content behind it inaccessible. It also required multiple clicks to first open the sidebar and then to expand it to view its content.
Now the sidebar appears below the page and can be scrolled to naturally. You can also click the top of the sidebar to auto-scroll it to the top of the page.
This change also allows support for opening multiple sidebars, something the previous layout made difficult. Sidebars are stacked vertically top-to-bottom on mobile.
On desktop, multiple sidebars used to open in floating windows. This had similar issues to the previous mobile layout: covering the content behind them. So now sidebars on desktop stack horizontally left-to-right.
In the future, this new layout will allow for multiple maps and other visualisations to be open at the same time. This is of course already possible by opening multiple web browser tabs, but browser tabs don't share the same state when editing data and it's tedious on mobile, especially in standalone app mode.
There are other improvements to be made too, such as reordering and resizing sidebars, maintaining sidebars on reloads and being able to jump between many sidebars quickly.
Check out the current plans for 2023. To help drive FrontierNav, don't forget to share your feedback and suggestions.
Thanks for reading.
FrontierNav is first and foremost about organising and exploring structured game data. However, there are benefits to both structured and unstructured data, like how Wikidata supplements Wikipedia. For FrontierNav, I'm hoping to make that combination more streamlined and easier to navigate.
To start, wiki pages are pretty simple. You can find them under the "Overview" section on any entity page. They are essentially text attachments and function similarly to image attachments. Like images, text files can be uploaded from your device, but to make creating and editing text easier, there's also a simple built-in text editor and preview. For consistency, text is formatted using Markdown like everything else on FrontierNav.
Wiki pages will be fleshed out further as more data is introduced and needs arise. It's possible that, as FrontierNav's data format matures, wiki pages will become central to how data is formatted and contributed to FrontierNav, similar to other wiki platforms.
Check out the current plans for 2023. To help drive FrontierNav, don't forget to share your feedback and suggestions.
Thanks for reading.
The change request process on FrontierNav should now be intuitive enough that anyone can take ownership of specific wikis without me being involved. So, if you're interested, let me know. There's some work remaining around giving people admin permissions, but I'll prioritise that when there's interest.
As per the plans for 2023, for the next milestone I'll be introducing wiki articles. The work here was partially started a few years ago so it should be relatively simple. Once that's done, I can finally work on improving FrontierNav's layout system, especially on mobile.
Of course, I'll continue improving existing features like the maps, tables and completion tracking too as needed. So if you have any suggestions, let me know.
Beyond those plans, FrontierNav is at a stage now where it's so large that I'll need to spend some time formalising workflows and automating more tests. It'll also benefit from more eyes and hands to improve its wide range of features. Which is why I've started organising the project towards becoming open source again, this time with more clear intent. This will also help push forward decentralisation efforts where multiple instances of FrontierNav "sites" can communicate and share information with each other rather than having a single "frontiernav.net" hosting everything.
Check out the current plans for 2023. To help drive FrontierNav, don't forget to share your feedback and suggestions.
Thanks for reading.
!g away.
However, over the last year or so, DuckDuckGo's search quality has tanked. It's gotten to a point where it doesn't even show relevant Wikipedia articles. Wikipedia should easily be prioritised and break through all the spam. I think it's down to it being reliant on Bing's index. Microsoft was never all in on being a good search engine, it just wants to be a part of every possible market. It doesn't need to create a good product when it can just force its billions of Windows users to use it.
So, I started looking for alternatives. The only notable alternatives to Google and Bing are some paid options (like Kagi), which need accounts, and Brave Search.
Brave Search has its own index and the results are pretty good given it's only been around for a year or so. I'd argue it's better than Bing. Wikipedia links appear as they should. It provides Reddit results in a dedicated "Discussions" section, without needing to add a site:reddit.com filter. It even supports ! operators like DuckDuckGo.
Given all of these positives, I made it my default. It's been a few months since then and overall, I'd say it's been okay, but not acceptable. There are just too many issues and limitations which make it unreliable as a default choice. To name a few:
Oddly enough, image search is one of the best ways to find more niche websites as spam tends to avoid using relevant images.
Brave requires you to pick Google or Bing for image search. This choice needs to be made every time on new sessions, which gets a bit annoying when using multi-account containers and private windows. I'm not sure if Brave is planning to provide its own image search, but it's a massive hole for a default provider, even if 50% of the time I end up having to use Google anyway.
Making queries too fast triggers a captcha page sometimes. This has never happened to me on other search engines.
There are times when Brave seems to get stuck loading. Sometimes it resolves, sometimes it doesn't. Some days are worse than others.
With Brave being all in on cryptocurrency, it's not a surprise that they're into AI too. They have an annoying "AI summarizer" which takes up a good chunk of the results page. Since it doesn't always appear, it makes search unpredictable. Almost like an ad. There's no way to disable it other than by using an ad blocker to pick out and remove the DOM element.
This isn't much different from how DuckDuckGo annoyingly inserts shopping, news and map embeds at the top, causing layout shifts and missed clicks; I had to use ad block for those too. Brave embeds those sections in between results, which is still annoying but not as bad as having them right at the top.
It's always great to see new search indexes being brought in to challenge the current state of web search. However, it takes a lot of resources, and with Google being the default, it's difficult to see how they can be sustainable in the long run.
I'm back to using DuckDuckGo as my default now. I'll be keeping an eye on Brave Search but I'm not too optimistic given Brave's focus and the current state of the public web.
Thanks for reading.