I just installed apt-cacher-ng for caching my apt upgrade packages and saw a huge time improvement even though I have a good internet connection. It acts as a proxy and caches the downloaded packages.

Do you run something similar? Or maybe even a local repo mirror? Warning: full mirrors are really big, so I don’t think they are recommended unless you really need almost everything.
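For reference, clients point apt at the cache with a one-line configuration fragment; in this sketch 192.168.1.10 is a hypothetical server address and 3142 is apt-cacher-ng’s default listen port:

```
# /etc/apt/apt.conf.d/00aptproxy  (192.168.1.10 is a placeholder for your cache host)
Acquire::http::Proxy "http://192.168.1.10:3142";
```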

  • vegetaaaaaaa@lemmy.world · 9 months ago

    I want to look into apt-cacher-ng for learning purposes, to stop tens of VMs in my homelab from adding load to the official Debian repos, and also to check if there is a way to have it only mirror a list of “approved” packages.

    saw a huge time improvement even though I have a good internet connection

    Note that for best performance you should use https://deb.debian.org/
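    For reference, a minimal sources.list using the CDN looks like this (the release name is illustrative):

    ```
    deb https://deb.debian.org/debian bookworm main
    deb https://deb.debian.org/debian bookworm-updates main
    deb https://deb.debian.org/debian-security bookworm-security main
    ```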

    Semi-related I have set up a personal APT repository on gitlab pages: https://nodiscc.gitlab.io/toolbox/ (I think Ubuntu users would call that a “PPA”). It uses aptly and a homegrown Makefile/Gitlab CI-based build system (sources/build tools are linked from the page). I wouldn’t recommend this exact setup for critical production needs, but it works.
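    For anyone curious, the aptly side of a setup like that boils down to a few commands (a sketch; the repo name and ./build/ path are made up, and publishing requires a GPG key):

    ```
    # Sketch -- "myrepo" and ./build/ are made-up names.
    aptly repo create -distribution=stable -component=main myrepo
    aptly repo add myrepo ./build/*.deb
    aptly publish repo myrepo   # needs a GPG key; writes a dists/pool tree you can serve
    ```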

    • TCB13@lemmy.world · 9 months ago

      And… you can also convert the ISO files into a hosted repository on your network using Apache:

      apt install apache2 build-essential
      mkdir /var/www/html/packages

      Now, create additional directories under /var/www/html/packages/ to store packages for each architecture your systems use. For example, create a directory "amd64". You can keep multiple directories and serve packages to systems of different architectures at the same time.

      mkdir /var/www/html/packages/amd64

      Copying all DEB files from Debian installation media

      Mount the first CD/DVD and copy all .deb packages from it to the /var/www/html/packages/amd64/ directory.

      mount /dev/cdrom /media/cdrom
      find /media/cdrom/pool/ -name "*.deb" -exec cp {} /var/www/html/packages/amd64 \;

      After copying all deb files, unmount the first DVD using the following command.

      umount /media/cdrom

      Then mount the remaining CDs/DVDs one by one and copy their .deb files in the same way.

      To verify the files, navigate to http://192.168.1.150/packages/amd64/ in your browser. You will see all the packages from your Debian DVDs. Here 192.168.1.150 is my Debian server’s IP address.

      Create Catalog file

      Switch to your repository directory, i.e. /var/www/html/packages/amd64/:

      cd /var/www/html/packages/amd64/

      and enter the following command to create a catalog file for APT to use. You must run this command so that Synaptic Manager or APT can fetch packages from our local repository; otherwise the packages in your local repository will not show up in Synaptic or APT.

      dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

      This command will scan all deb files and create the local repository in your Debian server. This may take a while depending upon the number of packages in your local repository folder. Be patient or grab a cup of coffee.

      Sample output:

      dpkg-scanpackages: warning: Packages in archive but missing from override file:
      dpkg-scanpackages: warning: accountsservice acl acpi acpi-support-base acpid adduser adwaita-icon-theme apache2-bin apg apt apt-listchanges apt-offline apt-utils aptitude aptitude-common aptitude-doc-en aspell aspell-en at at-spi2-core avahi-daemon

      […]

      xserver-xorg-video-neomagic xserver-xorg-video-nouveau xserver-xorg-video-openchrome xserver-xorg-video-r128 xserver-xorg-video-radeon xserver-xorg-video-savage xserver-xorg-video-siliconmotion xserver-xorg-video-sisusb xserver-xorg-video-tdfx xserver-xorg-video-trident xserver-xorg-video-vesa xserver-xorg-video-vmware xterm xwayland xz-utils yelp yelp-xsl zenity zenity-common zlib1g

      dpkg-scanpackages: info: Wrote 1151 entries to output Packages file.

      Please note that whenever you add a new .deb file to this repository, you must re-run the above command to regenerate the catalog file.
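      That regeneration step is easy to wrap in a small helper script (a sketch; the script name is made up and the default path matches the walkthrough above):

      ```
      #!/bin/sh
      # regen-index.sh (hypothetical name): rebuild the Packages.gz index
      # after adding .deb files. Takes the repo directory as an optional argument.
      set -e
      REPO="${1:-/var/www/html/packages/amd64}"
      cd "$REPO"
      dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz
      echo "Rebuilt ${REPO}/Packages.gz"
      ```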

      Done! We created the catalog file.

      Configure Server sources list

      After creating the catalog file, configure the server (local) system itself. Open the /etc/apt/sources.list file:

      nano /etc/apt/sources.list

      Comment out all lines and add your APT repository location as shown below.

      deb file:/var/www/html/packages/amd64/ /

      Configure Clients

      After creating the catalog file, go to your client systems. Open /etc/apt/sources.list file.

      vim /etc/apt/sources.list

      Add the server repository location as shown below. Comment out all sources list except the local repository.

      deb http://192.168.1.150/packages/amd64/ /

      Note: there must be a space between deb, http://192.168.1.150/packages/amd64/, and the trailing /.

      • Mio (OP) · 9 months ago

        The DVDs are fine for offline use, but I don’t know how to keep them updated. They would probably also take loads of space, since I guess they amount to a full repo mirror.

        • TCB13@lemmy.world · 9 months ago

          So what are you using for a local repository mirror? apt-mirror or ftpsync? I usually keep ISOs for the architectures that interest me using jigdo as it can update them later on.

          ISOs are harder to maintain for sure but they’re more standalone and might survive adversities better.
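          For reference, fetching or refreshing a DVD image with jigdo looks roughly like this (the URL is illustrative and version-specific):

          ```
          # Illustrative .jigdo URL; jigdo-lite prompts for files to scan (e.g. the
          # previous ISO or a mounted copy) so it only downloads what changed.
          jigdo-lite https://cdimage.debian.org/debian-cd/current/amd64/jigdo-dvd/debian-12.5.0-amd64-DVD-1.jigdo
          ```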

    • Mio (OP) · 9 months ago

      I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course. With apt-cacher-ng I don’t need nala at all: the low latency and gigabit connection to my server make access fast. I just need to find a good way to fill it with new updates.
      A second problem is figuring out whether anything can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, especially on older hardware.

      apt update - 4 seconds vs 16 seconds.

      apt upgrade --download-only - 10 seconds vs 84 seconds.

    • TCB13@lemmy.world · 9 months ago

      Do you know you can use the ISO files as repositories? Easier in some situations.

      1. Create the folders (mountpoints) to mount the ISO files:

         sudo mkdir -p /media/repo_1
         sudo mkdir -p /media/repo_2
         sudo mkdir -p /media/repo_3

      2. Mount the ISO files:

         sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-1.iso /media/repo_1/
         sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-2.iso /media/repo_2/
         sudo mount -o loop ~/Downloads/debian-8.0.0-amd64-DVD-3.iso /media/repo_3/

      3. Edit the /etc/apt/sources.list file to add the repositories:

         vim /etc/apt/sources.list

         deb file:///media/repo_1/  jessie main contrib
         deb file:///media/repo_2/  jessie main contrib
         deb file:///media/repo_3/  jessie main contrib

      4. Run sudo apt-get update
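      To make the loop mounts survive reboots, they can also go in /etc/fstab (a sketch; fstab needs absolute paths, so /home/user is a placeholder):

      ```
      # /etc/fstab entries (sketch) -- ~ does not expand here, use absolute paths
      /home/user/Downloads/debian-8.0.0-amd64-DVD-1.iso  /media/repo_1  iso9660  loop,ro  0  0
      /home/user/Downloads/debian-8.0.0-amd64-DVD-2.iso  /media/repo_2  iso9660  loop,ro  0  0
      /home/user/Downloads/debian-8.0.0-amd64-DVD-3.iso  /media/repo_3  iso9660  loop,ro  0  0
      ```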