
Questions tagged [compression]

Shrinking (compressing) and restoring (decompressing) of data.

0 votes
2 answers
154 views

I have a folder with around seventy subfolders, each containing a few tarballs which are nightly backups of a few directories (the largest being /home) from an old Raspberry Pi. Each is a full backup; ...
kj7rrv's user avatar
  • 261
0 votes
2 answers
243 views

If I defragment files on btrfs with the command btrfs filesystem defrag --step 1G file, everything is fine. A filefrag -v file clearly shows that the extent count decreased significantly. Things are very ...
peterh's user avatar
  • 10.5k
2 votes
0 answers
131 views

My backup strategy currently primarily consists of daily backups of all of my machines with Borg Backup, stored on different storage devices in different locations, following the 3-2-1 strategy. These ...
PhrozenByte's user avatar
0 votes
0 answers
91 views

I tarred one file with: tar cf My-tarball.tar path/to/file.txt Then compressed it: gzip My-tarball.tar But when I decompress and extract it with gunzip My-tarball.tar.gz and tar -xf My-tarball.tar, the ...
BenjamimCS's user avatar
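For reference, a minimal sketch of the usual one-step approach with GNU tar, reusing the file names from the question:

    # Create a gzip-compressed tarball in one step
    tar czf My-tarball.tar.gz path/to/file.txt
    # Decompress and unpack in one step
    tar xzf My-tarball.tar.gz
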
0 votes
1 answer
80 views

I have page_1.pnm, …, page_6.pnm, which represent 6 pages of a scanned document, all in gray PNM produced by scanimage and manually postprocessed with GIMP. The command convert $(for i in 1 2 3 4 5 6; ...
AlMa1r's user avatar
  • 1
1 vote
1 answer
245 views

Assume you have a PNM or PNG image file, gray or color. With ImageMagick, you wish to generate a possibly small PDF file from it without losing information. So far I thought it is simply convert ...
AlMa1r's user avatar
  • 1
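A sketch of two possible lossless routes, assuming ImageMagick's convert and, as an alternative, the img2pdf tool; input.png and output.pdf are placeholder names:

    # ImageMagick: write the PDF with Flate (Zip) compression, which is lossless
    convert input.png -compress Zip output.pdf
    # img2pdf: embeds the image stream into the PDF without re-encoding it
    img2pdf input.png -o output.pdf
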
0 votes
1 answer
110 views

Assume an image opened in GIMP in Debian 12. From this image, you would like to create a single-page PDF file with maximum lossless compression. How? As of 2024-12-19, https://docs.gimp.org/en/gimp-...
AlMa1r's user avatar
  • 1
1 vote
0 answers
32 views

I have multiple tar.gz archives, the size of each archive is approximately 40 GB: v1.0-trainval01_blobs.tgz v1.0-trainval02_blobs.tgz ... v1.0-trainval10_blobs.tgz I can unpack each archive and get ...
Ars ML's user avatar
  • 11
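If the goal is simply to unpack all of the archives in one go, a loop along these lines may help (a sketch only; the extracted/ target directory is an assumption):

    # Unpack every trainval blob archive into the same target directory
    mkdir -p extracted
    for f in v1.0-trainval*_blobs.tgz; do
        tar xzf "$f" -C extracted/
    done
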
0 votes
1 answer
155 views

I'm trying to create an archive e.g. archive.tar.gz inside the current working directory e.g. /builds/project/ without saving the archive.tar.gz inside archive.tar.gz. To prevent this I'm trying to ...
rosaLux161's user avatar
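One common workaround, assuming GNU tar, is to exclude the output file explicitly; a sketch, not necessarily the approach the asker tried:

    # Exclude the archive itself so tar does not try to pack its own output
    tar --exclude=./archive.tar.gz -czf archive.tar.gz .
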
0 votes
1 answer
112 views

I am thinking of changing the filesystem on my data drive from ext4 to btrfs because btrfs can do compression and the storage space will run out at some point. I have seen that btrfs can do ...
user447274's user avatar
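For orientation, transparent compression on btrfs is typically a mount option; a minimal sketch, assuming a placeholder device /dev/sdX1 and zstd level 3:

    # Mount with transparent zstd compression
    mount -o compress=zstd:3 /dev/sdX1 /mnt/data
    # Recompress files that already existed before the option was set
    btrfs filesystem defragment -r -czstd /mnt/data
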
1 vote
1 answer
610 views

I am brand new to zstd/pzstd, trying out its features, compression, benchmarking it, and so on. (I run Linux Mint 22 Cinnamon.) This computer has 32 GB RAM. The basic command appears to be working, ...
Vlastimil Burián's user avatar
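For reference, zstd ships a built-in benchmark mode and pzstd parallelises compression; a small sketch, with somefile as a placeholder input:

    # Benchmark compression levels 1 through 19 on a sample file
    zstd -b1 -e19 somefile
    # Parallel compression across all cores with pzstd
    pzstd -p "$(nproc)" -19 somefile
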
0 votes
1 answer
88 views

If we create a tarball file by giving the following command tar -cvf Docs.tar $HOME/Documents/* then, post creation of the tarball, is it possible to use gzip or bzip2 or xz or some other compression ...
KDM's user avatar
  • 128
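For reference, each of these tools compresses the already-created tarball in place (run only one of them), producing Docs.tar.gz, Docs.tar.bz2 or Docs.tar.xz respectively:

    gzip  Docs.tar   # -> Docs.tar.gz
    bzip2 Docs.tar   # -> Docs.tar.bz2
    xz    Docs.tar   # -> Docs.tar.xz
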
1 vote
1 answer
115 views

I have a Z6 HDD pool with 6 x 18T drives and a SLOG; LZ4 compression is enabled by default. Now I need to store a large amount of small files and I'm worried about fragmentation. The files: 70K files ...
7E10FC9A's user avatar
  • 121
1 vote
0 answers
221 views

I need to archive a lot of gzip-compressed data. The problem is that compared to zstd, gzip is wasteful, both in terms of ratio and CPU time required to decompress the data. Because of that, I want to ...
d33tah's user avatar
  • 1,298
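A minimal sketch of recompressing one .gz file to zstd by streaming through a pipe (data.gz and data.zst are placeholder names):

    # Decompress the gzip stream and recompress it with zstd, no temp file on disk
    zcat data.gz | zstd -19 -T0 > data.zst
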
0 votes
1 answer
114 views

I have to restore a single file from a fairly large (~ 1 TB) afio archive I created by using the following script (debug messages omitted): #!/bin/bash # SRCDIR=/bak BAKDEV=/dev/disk/by-partlabel/...
Neppomuk's user avatar
  • 364
0 votes
1 answer
272 views

Context I want to try to unpack an MSI, Zip (or any archive), or EXE to inspect the contents. I also want to recursively try to unpack all extracted files. Using 7zip, I have found I can unpack MSI, Zip, ...
d.j.yotta's user avatar
  • 101
0 votes
2 answers
94 views

Folks, I need a hand with unpacking a big batch of rars. I don't want to do this one-by-one. I have several directories inside subdirectories of 'dir', with many .rar's in each. dir subdir_a ...
Jehan Alvani's user avatar
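Assuming the non-free unrar tool is installed, one possible batch approach is to let find run it inside each file's own directory; a sketch:

    # Extract every .rar beneath dir, each one in the directory it lives in
    find dir -name '*.rar' -execdir unrar x {} \;
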
0 votes
2 answers
319 views

I am trying to use tar to recursively compress all files with the .lammpstrj extension within the directory tree starting at the directory whose path is stored in the variable home. home contains the ...
Felipe Evaristo's user avatar
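A sketch of one way to do this with GNU tar, feeding it a null-delimited file list from find ("$home" as in the question; the output name is an assumption):

    # Pack every .lammpstrj file under "$home" into a single gzipped tarball
    find "$home" -type f -name '*.lammpstrj' -print0 |
        tar czf lammpstrj-files.tar.gz --null -T -
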
0 votes
1 answer
203 views

Basics I am running Apache/2.4.62 on my Raspberry Pi 4B (ARM64/AArch64) with Debian GNU/Linux 12 (bookworm). Some HW/OS info from neofetch: root@rpi4 ...
Vlastimil Burián's user avatar
0 votes
2 answers
234 views

Continuing from "find a file within a tar.gz archive", how is it possible to automatically and recursively search for files with given names, including files inside archives and compressed archives? For ...
AlMa1r's user avatar
  • 1
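For one level of nesting, a simple sketch is to list each archive's contents and grep for the wanted name (wanted-file is a placeholder; deeper nesting would need actual extraction):

    # Print the name of every .tar.gz whose member list mentions wanted-file
    find . -name '*.tar.gz' -print0 |
        while IFS= read -r -d '' archive; do
            tar tzf "$archive" | grep -q 'wanted-file' && printf '%s\n' "$archive"
        done
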
1 vote
2 answers
1k views

I'm not currently using compression with my btrfs-formatted disk, but am wondering how much space I'd save if I did enable it. Short of actually enabling compression on the disk and comparing the ...
Psychonaut's user avatar
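One rough, non-authoritative way to estimate the potential saving is to compress a representative sample with the same algorithm btrfs would use, e.g. zstd level 3 (paths are placeholders):

    # Compare the raw size of a sample directory with its zstd-compressed size
    du -sb /mnt/data/sample
    tar cf - /mnt/data/sample | zstd -3 -T0 | wc -c
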
1 vote
2 answers
908 views

Until now, I used to back up my data using tar with one of the LZMA compression options (--lzma, --xz or --lzip). I recently noticed that 7-Zip was ported to Linux in 2021 (https://www.xda-...
ChennyStar's user avatar
  • 2,019
0 votes
1 answer
203 views

I created some pigz (parallel gzip) compressed archives of my SSD disk drives (compiled version 2.8). I called one of them 4TB-SATA-disk--Windows10--2024-Jan-21.img.gz, which says the ...
Vlastimil Burián's user avatar
10 votes
1 answer
2k views

I have noticed that Nautilus (GNOME Files) can extract some RAR files that cannot be extracted using free packages like unrar-free or file-roller via CLI, nor using GUI tools like Engrampa or ...
tagomago's user avatar
  • 103
0 votes
3 answers
170 views

I need to extract a specific folder from a .tar.bz2 archive (34G). The issue is that it takes 1 hour. I guess this is due to the compression. I guess that without compression, extraction of a specific folder ...
pmor's user avatar
  • 757
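If the bottleneck really is single-threaded bzip2 decompression, GNU tar can be pointed at a parallel decompressor; a sketch, assuming lbzip2 is installed and using placeholder archive/path names:

    # Extract only one folder, decompressing with the parallel lbzip2
    tar -x -I lbzip2 -f archive.tar.bz2 path/inside/archive/
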
1 vote
0 answers
411 views

I'm running zswap on 6.2.0-39-generic (Ubuntu 22.04, HWE). My understanding is that zswap intercepts pages marked for swap, compresses them (if possible) and stores them in a compressed section of the ...
user3012926's user avatar
0 votes
1 answer
240 views

I am working on an embedded Linux system (5.10.24), and I am using jffs2 as the rootfs. Now I changed the kernel configuration of jffs2 as follows, # CONFIG_JFFS2_FS_WBUF_VERIFY is not set # ...
wangt13's user avatar
  • 651
0 votes
1 answer
292 views

I want to compress multiple files at once (with one command) but I don't want to put them into the same directory or the same file. How do I do it?
achhainsan's user avatar
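For reference, gzip and xz both compress each file argument into its own output file next to the original; a minimal sketch with placeholder file names:

    # Each file becomes file1.gz, file2.gz, file3.gz (originals are replaced)
    gzip file1 file2 file3
    # Same idea with xz, producing .xz files
    xz file1 file2 file3
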
1 vote
1 answer
666 views

My system is running Ubuntu 22.04.3 LTS. I plug a USB drive into that system. That USB drive (/dev/sdb) contains an Ubuntu installation, mostly in an ext4 partition (/dev/sdb3). That installation ...
Ray Woodcock's user avatar
0 votes
0 answers
97 views

I have multiple directories and subdirectories, with files inside. I would like to recursively compress each file in these (sub)directories that is not already an .xz file with xz (xz -9ze -T0), without ...
user avatar
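A sketch of one possible approach, parallelising across files with xargs rather than xz -T0 (which helps little on many small files); the current directory is assumed as the starting point:

    # Compress every regular file that is not already .xz, one xz process per file
    find . -type f ! -name '*.xz' -print0 |
        xargs -0 -n1 -P"$(nproc)" xz -9e
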
2 votes
0 answers
57 views

I have a large number of .tif's coming out of ScanTailor. Is there a way that I might OCR those .tif's with tesseract, holding the OCR data separate from the images; then compress the images, and ...
Diagon's user avatar
  • 740
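For keeping the OCR output separate from the images, tesseract can emit hOCR (or plain text) files alongside the untouched .tif; a minimal sketch with placeholder names:

    # Write the recognised text as page.hocr, leaving page.tif unchanged
    tesseract page.tif page hocr
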
-1 votes
1 answer
99 views

LONG ago I wrote a backup script for our site and have updated it ever since. However, occasionally things go wrong and some of the older backups are now broken. In days gone by I had used the utility ...
Richard T's user avatar
  • 288
0 votes
1 answer
642 views

I am looking to backup a large amount (~400GB) of different types of files (50% images and videos, 30% audio, 20% text). I hesitate to give an amount by which I'd like the size to be reduced, since I ...
Lukas's user avatar
  • 77
2 votes
1 answer
198 views

On Ubuntu 22.04, I found that these 2 methods produce different sha256 checksums for archive_tgz: tar czf /a/archive_tgz . tar czf /dev/stdout . | split -d -b 200M - /a/archive. && cat /a/archive.* > /a/...
Azreal's user avatar
  • 43
6 votes
1 answer
425 views

I need to download and decompress a file as quickly as possible in a very latency sensitive environment with limited resources (A VM with 1 cpu, 2 cores, 128MB RAM) Naturally, I tried to pipe the ...
Richard's user avatar
  • 113
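The usual streaming pattern avoids writing a compressed temp file at all; a sketch with a placeholder URL and zstd as an example codec:

    # Download, decompress and unpack in one pipeline
    curl -sSL https://example.com/data.tar.zst | zstd -d | tar x
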
1 vote
1 answer
972 views

Is there a recommendation for how much to compress various subvolumes based on their intended purpose? In my case, I have subvols @, @home, .snapshots/@root, .snapshots/@home, @srv, var/@log, var/@cache. This ...
squirrels's user avatar
  • 131
2 votes
1 answer
232 views

I have several rotated log files on my server, but when I compress them I get the extension 1.gz, whereas I just want .gz. These files are automatically rotated by the application: server.log.2023-03-16 ...
sam's user avatar
  • 21
1 vote
1 answer
1k views

Using FreeBSD 11.1-STABLE, I have a ZFS dataset configured with gzip-9 compression, as well as a recordsize of 8K. (This volume is for archiving of small files, not speed.) zfs get all pool02/...
s.co.tt's user avatar
  • 171
10 votes
1 answer
947 views

On my FreeBSD 13.2 system, the zless utility cannot view text files compressed with gzip or compress, warning that they may be binary files and then showing garbage if I say I want to see the contents ...
Kusalananda's user avatar
  • 356k
0 votes
1 answer
3k views

Here is an example of what I am trying to do: I have a folder (called 'dir') that contains the following: dir |_sub1.rar |_sub2.rar |_sub3.rar I will cd ~/ to dir and want to run a command that will ...
linuxuser24569's user avatar
13 votes
4 answers
2k views

I am looking for a way to compress swap on disk. I am not looking for wider discussion of alternative solutions. See discussion at the end. I have tried: Using compressed zfs zvol for swap is NOT ...
linuxlover69's user avatar
0 votes
1 answer
556 views

In Linux, compressing files can be accomplished with the tar and zip/unzip commands, perhaps more. I know that it is possible to establish password protection for security reasons. The question is: ...
Manuel Jordan's user avatar
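For reference, both zip and 7z support password protection, though the strength differs; a small sketch with placeholder names (7z additionally encrypts file names with -mhe=on):

    # zip: -e prompts for a password (legacy ZipCrypto unless the tool supports AES)
    zip -er secure.zip somedir/
    # 7-Zip: AES-256, plus encrypted archive headers
    7z a -p -mhe=on secure.7z somedir/
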
0 votes
2 answers
115 views

Instead of using non-POSIX tools such as unxz, what POSIX utility can I use to decompress files with the .xz extension? Neither xz nor unxz is a POSIX command, so if I want to run only POSIX commands, ...
just_another_human's user avatar
0 votes
1 answer
239 views

I am trying to install an application needed for my university. Whenever I am trying to symlink the .desktop file from original folder, it somehow compresses it and then puts it in the .local/share/...
KnR's user avatar
  • 3
1 vote
1 answer
3k views

I have a gz archive but for some reason tar said that the format is incorrect even though I can double click it in mac Finder and extract it normally, and file command shows the same format just like ...
phuclv's user avatar
  • 2,452
0 votes
1 answer
186 views

$ sudo mkfs.btrfs -fL borgbackups /dev/vgxubuntu/borgbackups $ udisksctl mount -o compress=ztsd:15 -b /dev/mapper/vgxubuntu-borgbackups Error mounting /dev/dm-3: GDBus.Error:org.freedesktop.UDisks2....
eugenevd's user avatar
  • 156
0 votes
1 answer
255 views

I have tons of .Z compressed files scattered across various directories and need to see the size of the file within each. I don't plan on uncompressing all the .Z files. Is there a way to see the content ...
Steve237's user avatar
  • 103
0 votes
1 answer
351 views

Posting this since I could not find it using Google and this took weeks. When extracting using Winrar, I am getting ! Attempting to correct the invalid file or folder name ! Renaming C:\Users\noam.s\...
Gulzar's user avatar
  • 135
0 votes
1 answer
379 views

I have a .tar.gz as input and want to extract the first 128 MiB of it and output as a .tar.gz in a single command. I tried: sudo tar xzOf input.tar.gz | sudo dd of=output bs=1M count=128 iflag=...
JohnnyFromBF's user avatar
  • 3,606
1 vote
0 answers
150 views

Given: a tar file with a rootfs where all permissions and ownerships are set. Expected: a tar file with the content of a subdirectory, let's say usr/local/lib, where permissions and ownerships are retained. ...
Michał F's user avatar
  • 301
