@Faceman2K23@discuss.tchncs.de avatar

Faceman2K23

@[email protected]

poop


Faceman2K23, (edited)

One of my mini PCs is an N95, which is similar to the N100 but with a higher peak power. It’s faster than the legendary old 2600K and has a decent little iGPU for video processing or general desktop use.

I run a Jellyfin test server on it; it transcodes high-bitrate 4K HDR H265 to 1080p SDR tonemapped H264 at over 200 fps, all while running my security camera dashboard with multiple video feeds.
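For anyone curious what that tonemapped transcode actually involves: Jellyfin does it in hardware on the iGPU, but the equivalent software-only ffmpeg command looks roughly like this. A sketch only — filenames are placeholders, and this CPU path is far slower than the QSV hardware path:

```sh
# Software-only sketch: 4K HDR10 (H265) -> 1080p SDR tonemapped (H264).
# Input/output names are placeholders; Jellyfin's hardware path does the
# same linearise -> tonemap -> convert-to-bt709 steps on the iGPU.
ffmpeg -i input_4k_hdr.mkv \
  -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable,zscale=t=bt709:m=bt709:r=tv,format=yuv420p,scale=1920:-2" \
  -c:v libx264 -crf 18 -preset medium \
  -c:a copy \
  output_1080p_sdr.mkv
```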

Their main limitation is that they usually have only a single memory slot, so keep that in mind.

Faceman2K23,

I wish TCL would stop referring to it as electronic paper; it’s a matte LCD with some desaturated modes for eye comfort.

For me, the major selling point of a true e-paper display is sunlight readability. If your “electronic paper” LCD can’t match e-ink, it’s not good enough.

The main E-ink patents are due to expire in 2026, so we should see some rapid development after that.

Faceman2K23,

I love my Boox Note 3. It’s an older device but still gets updates, has lots of tweaks for tuning the display on a per-app basis, runs Google apps, etc. I use it mainly as a reader for books and manga, but also for drawing notes and browsing the web.

Faceman2K23,

Transflective displays can work, but they aren’t as easy on the eyes as e-paper and they have poor contrast in direct sunlight.

Faceman2K23,

Wow, that’s a surprisingly big jump in performance for a software-based AV1 decoder.

Faceman2K23,

Gotta download at least a few actual Linux ISOs to be a real datahoarder.

Faceman2K23,

Yea I have 100tb of these weird Linux ISOs, I have no idea how they even got there either.

Which are the best SFW extensions for Tachiyomi?

I know this is not a Christian community, but I recently discovered Tachiyomi looking for alternatives to Saikou and a good alternative to read manga (which I don’t do much, but it makes me curious) is Tachiyomi and although installing extensions is really simple the vast majority have NSFW content (or hentai?) and Jesus, I...

Faceman2K23,

If you are into self-hosting, there is a very good Tachiyomi plugin for Komga, which is a manga and comics server you can self-host and fill with your own content; it also supports connecting to multiple servers.

I have two Komga instances running on my home server (one for normal manga and one for the… other kind). Content is collected by FMD2, which automatically downloads series I have set to monitor (similar to Sonarr but less polished). Then in Tachiyomi I have access to all of that content, streamed over the web to my phone and Boox e-reader anywhere in the world. It’s pretty neat; the only missing feature is synchronising read status and position across devices.
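If you want to try this setup, Komga’s official Docker image makes the server side a one-liner. Host paths here are placeholders for your own storage layout:

```sh
# Minimal Komga server sketch (host paths are placeholders).
# 25600 is Komga's default web/API port; /data holds your CBZ/CBR
# libraries, /config holds Komga's database and settings.
docker run -d --name komga \
  -p 25600:25600 \
  -v /srv/manga:/data \
  -v /srv/komga-config:/config \
  gotson/komga:latest
```

Then install the Komga extension in Tachiyomi and point it at `http://<your-server>:25600`.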

Faceman2K23,

It’s probably possible and safe for ROMs, but there are already more or less complete packs for basically every console and retro computer ever made, so it’s not that useful.

Faceman2K23,

The only MMOs I ever really spent significant time with were FFXI (on dial-up for most of the time I played, so my experience wasn’t great) and GW2. Not mentioning EVE… we don’t talk about that…

There was a free MMORPG called PlaneShift (I played it quite a lot around 2005; it’s still around and being actively worked on) that was very Elder Scrolls inspired. I put a lot of time into it, but being a small free amateur project it didn’t have a lot of players.

I just don’t have the time or motivation to give modern MMOs the attention they really need to make progress, and I was very much a solo player at the time so progress was slow and hard.

I think if I were to pick up a new one it would be FF14; it has a balance of open gameplay and story that a lot of MMOs ignore, and it seems to have a good community. But as with all MMOs, they really don’t want you to just drop in and play a couple of hours every second weekend; they want you logging in daily and sticking to a routine, which I just can’t commit to these days.

Faceman2K23,

I weep for your phone bill. I used to do that to send emails from my Sony Clié when WiFi was still pretty new and rare in public.

Are we old? I think we’re old.

Faceman2K23,

You’re in Aus right? How are your speeds?

I currently use both NewsgroupNinja and NewsDemon combined but still can’t get past about half a gigabit per second, though it very well may be a limitation in my download server somewhere.

Faceman2K23,

It’s not a problem at all, it’s pretty sweet; I’m just getting only about 400–500 Mbit when I can get over 900 combined testing with torrents and other services. ABB say they absolutely, unequivocally do not throttle Usenet, so I suspect it is a limitation of my server setup and I’m too lazy to try to find the cause…

Faceman2K23,

That’s a pretty old calculation that doesn’t take hardware acceleration into account, which modern chips do very, very well.

So you can get away with a Celeron-class Intel chip (or whatever they are calling them these days) and manage multiple 4K-to-1080p transcode streams without issue.

I have a mini PC with an Intel N95, which scores around 5,500 on the PassMark chart but can easily churn out 10+ 1080p-to-1080p transcodes if needed. Of course, the ideal setup is to avoid transcoding wherever possible. That’s not my main server, but it’s nice to know that if I wanted to I could move my Plex server onto it and it would be fine.

Faceman2K23,

The best way to improve Plex/Jellyfin performance is to make sure your players support what you are trying to watch; transcoding should only be needed if you are sharing remotely with someone who has either slow internet or a crappy old player.

Of course, with hardware encoding, the chip in that little computer can do what you are asking, including at 4K if needed.

Faceman2K23,

It’s not usually too hard to find older Norco and similar cases with 16+ drive bays.

I got one on FB Marketplace for less than the cost of a new 10tb drive.

Faceman2K23,

A simple, reliable and always pleasing meal.

If you haven’t tried it yet, next time try tossing the broccoli with olive oil and a bit of salt, then bake it in the oven until the florets are brown and crispy. Works great with cauliflower too; you can parboil them beforehand if you prefer a softer texture and it still works great.

Faceman2K23,

Now I’m hungry.

Faceman2K23,

Not sure about a do-all solution for the manga side of things, other than some outdated archive torrents or things hidden in the depths of Usenet, but a free program called FMD2 can automate the downloading and CBZ-ifying of manga from hundreds of sources. It can act like a Sonarr for manga once set up with series you follow, and of course you can bulk download, just at a slower rate due to source rate limiting.

Faceman2K23,

The UHD 630 is pretty good, but it is old enough that it might have some codec limitations. I think it was the first generation with full H265 10-bit support, so your 4K HDR rips can be transcoded if need be.

You should be able to manage 4 or 5 simultaneous 4K-to-1080p transcodes and at least 10 (likely more, depending on bitrates) 1080p transcode streams.

It is still best practice to avoid transcoding wherever possible, but if it is needed it should be quick and seamless on that chip.

Faceman2K23,

Digging into some deep speculative fiction here rather than pure fantasy.

An idea I’ve always liked that can be extended to include tectonics is deep speculative biology: creating links between plants and animals with ancient evolutionary connections to distant relatives across former continents. Within this world, extremely slow-evolving or slow-living creatures could experience time differently to us; to our eyes they might seem completely static, like rocks or trees, but to them they are a thriving and active community.

From that you can have civilisations that have existed in some form across hundreds of millions, or even billions, of years, with notable epochs in their long history tied to points in time when different landmasses were connected, causing major societal upheaval and rapid evolutionary shifts, but also access to potential new resources and technology.

You could have one group of people on an isolated plate that over hundreds of millions of years contacted other landmasses and either settled them, causing new disparate groups to form, or met existing groups and shaped their future by passing on some technology or resource.

Imagine an overwhelmingly large planet with uncrossable oceans (exactly how they’re uncrossable could be expanded upon), where travel across the ocean is completely impossible or impracticable in some way; perhaps flight is also impossible due to extreme gravity, and high technology may be impossible due to scarce resources. Plate collisions are rare due to the immense size of the planet and the comparatively small size of the landmasses, so the only way these people have ever come into contact with other civilisations or non-native species is through the slow movement of their plates. They have remnants of stories from their deep history that have long since melded into mythology, possibly becoming their creation myths.

So a short prompt for a story or role play of some kind could be:

An outcast scientist has created a piece of equipment to peer into the distance, and he sees what appears to be land, and it seems to be moving towards them, fast. He tries to tell the ruling priests that a landmass is approaching and will make contact within a generation; they scoff, as the prophecies say it will be a thousand generations before the next “meeting”.

Faceman2K23,

People have tried and failed to make the “one arr to rule them all”, but the current stack is pretty lightweight, stable and mature, so it is better to just install them all in containers and then put some kind of frontend and request system in front of them for users and admins.

I use Organizr as a frontend (keeps them all together in one interface and optionally handles SSO across all of them) then I have Overseerr for users to add media without having to give them access to the arrs directly.

Faceman2K23,

I’ve tried to use Lidarr, but I think my archive is too weird for it.

Not only do I have a lot of obscure releases, but I also have things like vinyl and CD rips of every version of every album by certain artists. I have a huge amount of Frank Zappa, for example; sometimes I will have 10 versions of the same album, sometimes more. I have collections of the different live variants of many tracks, archives of guitar solo variations, etc…

Lidarr has no idea how to handle that, so I do it all manually.

Faceman2K23,

While I tend to avoid re-encoding wherever possible, I use H265 10-bit at low RF to archive non-critical libraries (old TV shows in some users’ personal libraries, 1080p movies more than a year old and over 20 GB, etc…).

My average size reduction going from a 35–40 GB 1080p Blu-ray remux is about 50%, with no significant effect on image quality. High-action or high-grain movies end up a bit larger; slower movies with no action and most animation compress a bit smaller. Works well overall.

Basically any modern device can decode them, and the image quality tends to be a bit better than 8-bit.

I’d like to go with AV1, but very few of my client devices can decode it, so it’s not worth the trouble to save a few percent.
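For reference, an archive encode of the kind described above (H265 10-bit at a constant-quality setting, the RF slider in HandBrake maps to CRF here) looks roughly like this with ffmpeg’s libx265. The CRF value and filenames are illustrative, not my exact settings:

```sh
# Illustrative archival re-encode: 1080p remux -> H265 10-bit constant quality.
# CRF ~18-22 trades size vs quality (lower = bigger, better looking).
# -map 0 keeps all streams; audio and subtitles are copied untouched.
ffmpeg -i movie_remux_1080p.mkv \
  -map 0 \
  -c:v libx265 -preset slow -crf 20 -pix_fmt yuv420p10le \
  -c:a copy -c:s copy \
  movie_h265_10bit.mkv
```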

Faceman2K23,

Sounds like a bug in the encoder, perhaps. I don’t have that issue with my setup, but I’m not using AMD GPUs.

Faceman2K23,

Automatically ripping movies is pretty easy, but TV shows often need manual work to get them right.

Sometimes you’ll get individual videos with the correct chapters and runtimes, listed in order, but other times they will be jumbled in random order, or will be one large video that needs to be split manually into episodes.

Faceman2K23,

If anyone has the means and a proper setup for it, the surround sound mix of this album is a crazy experience.

Faceman2K23,

That was such a dumb release, I love it. Perfectly on brand for them.

The actual surround mix itself is fantastic, lots of motion and depth.

Transcoding performance on Intel NUC

I’m taking down my big Supermicro server to save energy and moving Plex/Jellyfin/*arr to a spare 10th gen Intel NUC with SSDs. Performance is fine for DirectPlay media to my SHIELD and mobile devices, but the onboard GPU power is limited and struggles to even transcode some 1080p media – let alone 4K. Does anyone have...

Faceman2K23,

Something is definitely wrong with the config if it’s failing at a single 1080p stream. I did a Plex server test on an Intel N95 mini PC (one of the lowest-end CPUs in their current range) and it blasted through multiple 4K HDR transcodes simultaneously.

Faceman2K23,

It’s fine, it can do a few at once. I didn’t do a lot of testing since I never have to transcode these days anyway.

Faceman2K23,

I think it’s worth figuring out why FileFlows wasn’t using your hardware; if you’re running in a container, you may need to manually map the hardware device, for example. You can customise and configure as much as you want, even going as far as custom ffmpeg command-line options, and having multiple options based on the flow you write.

However, the CPU encode will provide better image quality in most cases, and since you can set it to run slowly in the background with a couple of CPU cores and limited usage, you can just let it run and eventually it will be done.

I run FileFlows on my file server and it uses a GPU, but some files have to fall back to CPU (my GPU is a bit older, so there are some unsupported formats), in which case the job gets just a single core and takes its time. It’s saved me several terabytes of space archiving old content in libraries that I consider less than critical. I could have just run a one-time ffmpeg batch job, but I like having it check regularly so that new additions to the library enter the flow, they stay untouched for

Faceman2K23,

Jellyfin is great and I follow its development and test it every now and then, but it is nowhere near fully featured or well supported enough for me to transfer my family over to it.

I will eventually, when it’s ready.

Faceman2K23,

Totally different software solutions aimed at different users, and many people use both.

Plex is server software that handles media management, libraries, users, etc., plus a range of player apps with a somewhat beginner-friendly layout requiring little to no setup.

Personally, I run a large Plex server that provides content for my family across dozens of mixed devices, in home and out. Different users have access to different libraries and have different preferences. If needed, it will automatically transcode content for remote users to fit my upload bandwidth and their available speed if they are on mobile. It keeps track of watched content and position for all users so they can move between devices seamlessly.

Kodi is an extensible media player frontend. It can play files from a remote server or NAS, but there is no server management; it is just doing basic file access. There are addons for many common services and media sources, but there is no user management, no transcoding, no sharing content with other clients, and so on. Having multiple Kodi installs on multiple players requires each client to be configured more or less from scratch, and there is no easy way to have multiple setups for different users with their own preferences, libraries and/or content restrictions. It is extremely powerful and configurable and has strong format support.

I have Kodi installed on one of my Nvidia Shield Pros but only use it for playback of surround music files (support for 5.1 FLAC on Plex seems to be limited to audio within video containers for some reason). I find the interface (and all the skins I tried) extremely clunky for use as a music player; the way the remote works within the player itself is unintuitive and makes for an annoying experience, restarting the track when you just want to move playback a few seconds. A bit unfair of course, as that isn’t what it was made for, but that’s just my experience.

Faceman2K23,

A remux is just the video and audio data put into a new container, with no compression. But it is just the main movie file: no menus, no extras, nothing like that.

The common tag for searching for whole discs (BDMV folders, VIDEO_TS folders or ISO images) is BR-DISK.

Faceman2K23,

Search for the “BR-DISK” tag.

Faceman2K23,

Another Tdarr alternative with (I think) a more flexible and automatable setup is FileFlows (the free version is all you need); you can build node-based rulesets and apply those rules to different libraries.

Faceman2K23,

I hate repost bots. If you want to use one as a back-er-up-er-er, it should post to a dedicated community intended as an archive, not the main communities for a topic. They are practically spam and don’t promote any conversation in the comments, as people avoid commenting on something that has zero connection to the original poster of the question.

Faceman2K23,

I’ve been using Prowlarr and Jackett to do mass searches of public trackers and my Usenet sources for every variation of surround, multichannel, DVD-A, BD-A, SACD, DFF, DSD, Atmos, AC3, DTS, 5.1, 7.1, etc…

I think I have about 70 surround albums so far, excluding concerts, which I keep in a separate collection.

Faceman2K23,

I’m hunting through there at the moment, but so far nothing I don’t already have or have access to.

Faceman2K23,

Found a few users with heaps, but a lot of fake amateur algorithm-upmixed stuff mixed in with real professional multitrack mixes.

Also a lot of Tidal rips, though they are mostly very poor quality upmixes done by engineers who don’t know how to use surround properly.

Faceman2K23,

Basically any PC with a recent CPU (Intel 10th gen or newer quad core) will work great for any normal media server build. You will just need enough room for your disks and some space to grow, plus the motherboard, CPU, RAM and PSU.

Since you already have disks with media on them, moving to a dedicated NAS OS will be a bit of a pain if you want some form of data protection. I’d definitely allow in the budget for at least two new large disks to start with. Personally I went down the Unraid path, as it allowed the most flexible disk mixing and matching; I could just throw whatever HDDs I had into it and all data was parity protected. It’s not free, but it makes for a good home NAS. Moving existing data and re-using the disks is a pain: you need to start with enough space to dump a whole disk to, then wipe that disk, add it to the array, and repeat for all of your disks. This can take days, but it works and gets your data loaded and parity protected with a minimum number of new disks required.

FreeNAS, now called TrueNAS, is an excellent option, but it will be less flexible when adding disks that aren’t the same capacity. You can’t just buy one HDD and drop it in to expand in the future; you tend to need to plan it out a bit more, but it is extremely fast and very reliable. So it’s free but can cost more in the long run.

If you like to tinker you can just run something like Ubuntu and set it all up from scratch, or there is one called Xpenology, a clone of the Synology software; it is very easy to use and reasonably flexible.

You can just plug the HDDs into the motherboard if it has enough ports, but I’d recommend getting onto eBay and picking up a SAS HBA card and SAS-to-SATA breakout cables; there are sellers that offer them as combo kits just for this purpose.

My first couple of server builds used the motherboard ports and the SATA controllers died pretty quickly. Then I got an LSI 9211-8i, later added a SAS expander for more ports, and more recently a newer 9300-16i card that will do me forever.

Faceman2K23,

I have a Boox tablet (an older Note 3) for the actual reading; I run Readarr as a downloader/manager and use Ubooquity as the server. If you aren’t a massive nerd, I’d probably suggest a Kobo reader over an Android reader.

I don’t tend to “stream” the books from the server because there is no point; they are tiny files, so I use the Ubooquity web UI to download the file to the device when needed. Even that is unnecessary, though, as I can just VPN into the server itself and pull the files, or have them all sync automatically when on WiFi; since it is just an Android device, I can run whatever apps I want for that. I only use Ubooquity because I used to use its web UI reader to keep in sync between multiple devices, but I stopped reading on my phone as I preferred the e-ink display. I could also just dump them to a USB-C disk and move them manually.

I might soon replace Ubooquity altogether and just have Readarr put the files into Nextcloud or something directly, and have that sync with the tablet when on WiFi.

The source for the titles themselves is the usual suspects, public trackers, usenet etc.

I’ve used Calibre in the past to convert and de-DRM books for a Kindle I used previously, but I never actually needed any of its other features like re-formatting or editing metadata, so I stopped using it as soon as I replaced the Kindle with the Boox reader.

Faceman2K23,

I usually get about two weeks out of my Boox tablet, but that is with all the radios turned off, no frontlight, and using the built-in reader app, which puts it into a super low power state, as opposed to third-party reader apps that burn through battery like nothing else.

Faceman2K23,

Ubooquity does this if you use the web reader across multiple devices; not sure about any third-party apps or e-reader integration, though.

Faceman2K23,

Yep. I’m 100 TB deep into that rabbit hole.

Faceman2K23,

I’m just waiting for a Mangarr that actually works; currently I run FMD in a container.
