Usually my process is very… hammer-and-drill related, but I have a family member who is interested in taking my latest batch of hard drives after I upgraded.

What are the best (Linux) tools for the process? I’d like to run some tests to make sure they’re good first and also do a full zero-out of any data. (They used to be in a RAID, if that matters.)

Edit: Thanks all, process is officially started, will probably run for quite a while. Appreciate the advice!

  • TiTeY`@lemmy.home.titey.net · 11 months ago

    Usually, I use shred:

    shred -vfz -n 2 /dev/device-name
    
    • -v: verbose mode
    • -f: force write permissions on the device if they’re missing
    • -z: add a final pass of zeros to hide the shredding
    • -n 2: two passes of random data before that final zero pass
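
    Since OP also wants to test the drives first, it’s worth confirming which device you’re about to wipe and running a health check before starting. A minimal sketch, assuming smartmontools is installed; /dev/sdX is a placeholder for the actual drive:

    # Confirm you have the right disk before wiping anything
    lsblk -o NAME,SIZE,MODEL,SERIAL

    # Quick SMART health summary, then start a long self-test (takes hours on large drives)
    smartctl -H /dev/sdX
    smartctl -t long /dev/sdX

    # Once the self-test finishes, review the results and error counters
    smartctl -a /dev/sdX
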
    • FaceButt9000@lemmy.world · 11 months ago

      Shred is what I used when destroying a bunch of old drives.

      Then I disassembled them to pull out the magnets and platters (because they’re shiny and cool). A couple had Torx screws that I didn’t have the right bit for, so I ran an HDD magnet over the surface and scratched them with a screwdriver.

      • tburkhol@lemmy.world · 11 months ago

        I have an inch-high stack of platters now. It’s kind of interesting to see how their thickness has changed over the years, including a color change in there somewhere. I keep thinking I should bury them in epoxy on some tabletop.

        For extra fun, you can melt the casings and cast interesting shapes. I only wish I were smart enough to repurpose the spindle motors.

        • Tangent5280@lemmy.world · 11 months ago

          Make sure you wear lung protection when you deal with those. They’re terrible for your insides.

    • Vilian@lemmy.ca · 11 months ago

      Why not just write zeros from the start, instead of writing random data and then zeroing it?

      • MeanEYE@lemmy.world · 11 months ago

        Like u/MrMcGasion said, zeroing alone makes it easier to recover the original data. Data storage and signal processing are pretty much a game of threshold values. From the digital side you see a 0 or a 1, but in reality it’s a charge somewhere on a scale, let’s say 0 to 100%. Anything above 60% might be read as a 1 and anything below 45% as a 0, or something like that.

        When you zero the drive, it reduces the charge enough to fall below that lower limit, but it will never actually be at zero. With custom firmware or special tools it is possible to adjust this threshold, and all of a sudden it is as if your data was never removed. Add the existence of checksums to this and total removal of data becomes a real challenge. That’s why all these tools do more than one pass to make sure the data is really zeroed or removed.

        For this reason random data is a much better approach than zeroing, because random data alters each block differently instead of just reducing the charge by a fixed amount, as zeroing does. Multiple passes of random data add further safety.
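
        Roughly speaking, that random-then-zero sequence is what shred automates. A hand-rolled sketch with dd, just to illustrate (the device name is a placeholder; shred is the more convenient way to do it):

        # One pass of pseudo-random data (repeat for additional passes)
        dd if=/dev/urandom of=/dev/sdX bs=1M status=progress

        # Final pass of zeros so the drive reads back as blank
        dd if=/dev/zero of=/dev/sdX bs=1M status=progress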

        All of this only applies to magnetic storage, that is to say HDDs. SSDs are a completely different beast: overwriting an SSD can reduce the drive’s lifespan without actually achieving the desired result. SSDs have wear-leveling algorithms that make sure every block is used equally, so while your computer thinks it’s writing to the beginning of the drive, that block can physically be anywhere on the device; the address is just translated internally to the real one.
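
        For SSDs the usual alternative is to let the controller discard or erase its own cells rather than overwriting them. Just for contrast, since OP’s drives are HDDs, a minimal sketch assuming util-linux is available and /dev/sdX stands in for the SSD:

        # Ask the SSD controller to discard every block (whole-device TRIM);
        # how thoroughly the data is actually erased depends on the drive,
        # so the drive’s built-in secure-erase feature is the more reliable option
        blkdiscard /dev/sdX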

      • MrMcGasion@lemmy.world · 11 months ago

        Just doing a single pass of the same value, like all zeros, often still leaves the original data recoverable. Doing passes of random data and then zeroing lowers the chance that the original data can be recovered.

        • Moonrise2473@feddit.it · 11 months ago

          The “can” in “can be recovered” means “if a state-sponsored attacker thinks you have nuclear secrets on that drive, they can spend millions and recover the data by manually analyzing the magnetic flux in a cleanroom lab”, not “you can recover it by running this program”.