Posts 17 · Comments 327 · Joined 1 yr. ago

  • https://substack.evancarroll.com/p/the-erasure-of-luigi-mangione

    even if you disagree with what he did, luigi was someone in horrible emotional pain due to an injury, and someone who once wrote code and contributed to the site. big tech erasing people from history because they can is disgusting and should be called out for what it is. luigi was not always injured, in horrible pain, and accused of a serious crime. just making it seem like he didn't exist is pure sleaze and kowtowing to the elite corporate interests that absolutely hate how certain segments of society like him. it's a great example of why big tech controlled by oligopolies and concentrated power is a horrible way for the public to interact with the internet. whether you support or condemn what he did, erasing people (albeit through "private" oligopolies) is similar to what the worst government censors did in very harsh regimes.

  • Thank you! There are two kinds of people in the world: those great at coding, and those great at watching netflix and doing whip-its.

    edit: actually this torrent downloads ubuntu-22.04.5-desktop-amd64.iso, which isn't the latest LTS version (which is 24.04.1). It's probably good enough, but part of the challenge here was to always torrent the latest one.

    does anyone know why ubuntu-latest.torrent would try to download 22 instead of 24 LTS?
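
    one way to check is to dump the torrent's metadata and see which ISO it actually references. this is just a sketch, and it assumes transmission-cli (for transmission-show) or aria2 is installed and that the file is literally named ubuntu-latest.torrent:

    # print the torrent's name, file list, and trackers
    transmission-show ubuntu-latest.torrent

    # or, with aria2 installed, list the files the torrent contains
    aria2c --show-files=true ubuntu-latest.torrent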

  • Well, when I asked DeepSeek, because you can see it thinking, you sort of see it trying to please the user while dealing with the censors and political environment lurking in the background; instead of just thinking, it has to do this dance of politeness to not upset the censors. I even cut and pasted information from Wikipedia about it, and it was like "no, I need official sources," so I cut and pasted the bibliography section from Wikipedia, and it was like "the user is crazy and making up official sources" (jk). It said something else, but it was odd that even when given overwhelming information, it seemed almost afraid to budge, or purposely stupid. It seemed like "official sources" was its slang for government-approved information, just like "misinformation" in the US during the Biden administration often meant "not approved by the US administration." (Although misinformation was sometimes applied to genuinely false things, people got banned on Facebook for mentioning the lab leak theory back when saying anything other than "it came from a bat" was treated as misinformation, and they had said scientists had done genetic tests proving it came from the bat, gain-of-function covid lab next to the market be damned.)

  • you're probably right

    i'm trying to force a tool I have to do something it's not capable of doing. I either have to abandon this or learn more bash scripting. It's probably not something important enough to have a script for; it's just surprising it can't be done, because it seems like such an easy task.

  • when i first started linuxing linux, i tried to read man man, but it was over 700 lines

    my primary linux skills are whip-its and watching netflix. i'm sort of at the lower-skill end of the linux userbase. i don't know if there's any way to ever change that, so why try?

  • here is deepseek deepsucking, to use a phrase someone else came up with (a working sketch follows the quoted output below):

    Thought for 2 minutes 13 seconds

    Okay, so I'm trying to write a bash script that can download the latest Ubuntu LTS release from the given torrent site. From what I understand, the user wants the script to automatically find the most recent LTS version and then wget it based on its filename.

    First, looking at the example output from curl, I see that each row has an info hash, a torrent name, and some download stats. The torrent names seem to follow a pattern like "ubuntu-YYYY.YY.YISO desktop-amd64.iso". My goal is to extract the version number from these names.

    I need to figure out how to programmatically identify the latest LTS release. The user mentioned that LTS releases have the format XX.XX.X, so I should look for three numbers separated by periods. But I also know that sometimes there are more than two digits in each part (like 20.12 instead of just 04), so my script should handle varying lengths.

    I think using a regular expression would be the best approach here. Something like matching on \d+.\d+.\d+ should capture all three parts, regardless of their length. Then, I can split these into integers and compare them to find the maximum version number.

    Once I have the latest version, I need to extract the filename from it. The filenames after the info hash look like "ubuntu-{version}-desktop-amd64.iso". So, I'll need to parse this string to get just the version part.

    Next, constructing a curl command to download the file might be necessary. But since the user mentioned using wget, maybe they prefer that method instead of curl for simplicity. Alternatively, if the script is supposed to use curl, then it would involve combining the filename into a full URL and appending it to the curl command with options like -o for output.

    But in their example, they showed using wget directly, so perhaps the script can be written to take the info hash or name and construct the appropriate wget command. However, sometimes the filename isn't directly available on the site; you have to click through pages. That complicates things because the number of pages could vary, making it hard for a static script.

    Looking at their example output, there are 8 files listed. The latest one is "kubuntu-24.04.1-desktop-amd64.iso" with 402 complete downloads. So I need to identify that as the current LTS and perhaps update when new versions come out.

    Putting this all together, my script will:

    Parse each torrent row. Extract the info hash or filename. Use regex to find the version number in the filename. Convert it into an integer for comparison. Keep track of the latest version found. Once the latest version is determined, construct a wget command to download that specific file. Potential challenges include handling different release formats (like 20.12) and dealing with multiple pages on the torrent site. Additionally, ensuring that the script handles cases where there are no more recent releases than current would be important for accuracy.

    I think starting by writing a function to extract the version number from each filename is essential. Then, another part of the script can loop through all filenames, apply this function, and keep updating the latest version as needed.

    Finally, constructing the full URL with the correct path seems tricky because the torrent site uses "/file?info_hash=X" format. But since we have the info hash from each row, maybe it's better to use that for downloading rather than trying to parse the path from the name.

    In summary, the script will involve parsing HTML content, extracting version numbers, and using wget or curl to download the file with the latest version.

    To assist you in writing a bash script that downloads the latest Ubuntu LTS release, here is a step-by-step explanation of the process:

    Parse the HTML Content: Extract all torrent rows from the provided URL.

    Extract Version Numbers: Use regular expressions to identify and extract version numbers from each row's filename. The format appears to be ubuntu-{version}-desktop-amd64.iso.

    Identify the Latest Release: Convert the extracted strings into integers for comparison. Keep track of the highest version encountered.

    Construct the Download Command: Use the identified latest version's filename in a wget command to download the file directly from the torrent site.

    Here is the bash script implementing these steps:

     
        
    
    #!/bin/bash
    
    # Function to extract version number from a string
    extract_version($filename) {
        # Using regex to find all numbers followed by periods
        echo "$filename" | grep -o "[0-9]+\.([0-9]+)" | head -n 1 | sed 's/\./\1/p' | head -n 1
    }
    
    # Function to determine the latest version number from a list of filenames
    latest_version($filenames) {
        local max_version=0
        for file in $filenames; do
            version=$(extract_version "$file")
            if [[ "$version" > "$max_version" ]]; then
                max_version=$version
            fi
        done
        echo "$max_version"
    }
    
    # Function to construct the download command using wget
    download Latest release:
        local latest_version=($(
            latest_version "${Filenames[@]}"
        ))
    
        echo "Downloading from https://ipv6.torrent.ubuntu.com/file?info_hash=V%7BV%7D" \
            . "$version -o /path/to/download/directory/$version.deb"
    
    # Replace with actual file list after parsing
    # Filenames should be replaced with the actual files obtained from the torrent site.
    
    This script will help identify and download the latest available Ubuntu LTS release based on the version number extracted from the filenames. Make sure to replace Filenames with your parsed data containing the necessary information.
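
    for contrast, here is a minimal working sketch of what I was actually after. it is only a sketch: it skips the tracker page from the quoted session and scrapes releases.ubuntu.com instead, and it assumes that index mentions versions like 22.04.5 / 24.04.1 and that each release directory contains ubuntu-<version>-desktop-amd64.iso.torrent (those assumptions are mine, not DeepSeek's):

    #!/bin/bash
    # Sketch: find the newest Ubuntu LTS point release listed on releases.ubuntu.com
    # and fetch its desktop .torrent with wget.
    # Assumption (not verified): each release directory there holds
    # ubuntu-<version>-desktop-amd64.iso.torrent.
    set -euo pipefail

    base_url="https://releases.ubuntu.com"

    # pull version strings out of the index page, keep only the even-year .04
    # (LTS) series, and sort them numerically so the newest ends up last
    latest=$(curl -fsSL "$base_url/" \
        | grep -Eo '[0-9][02468]\.04(\.[0-9]+)?' \
        | sort -t. -k1,1n -k2,2n -k3,3n \
        | tail -n 1)

    echo "latest LTS found: $latest"

    # grab the matching desktop torrent
    wget "$base_url/$latest/ubuntu-$latest-desktop-amd64.iso.torrent"

    note that wget here only fetches the small .torrent metadata file; transmission-cli or any other torrent client still has to do the actual ISO download.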
    
      
  • man bash is over 64,000 lines. we may also have different IQs, mine more suitable to asking profound philosophical questions to DeepSeek R1 (see https://lemmy.world/post/24838347 for example) and yours more suitable to learning bash scripting through electro-osmosis

    for everyone downvoting me, i have tried to get better at linux! don't think of me as a lazy linux user, think of me as among the best windows users who switched to linux

  • lol, did everyone in the West who likes AI and is a tech enthusiast immediately download a local version of DeepSeek R1 and, first question, ask about Tiananmen Square? (I know I did.)