[OpenIndiana-discuss] safely cleanup pkg cache?
Andreas Wacknitz
A.Wacknitz@gmx.de
Fri Feb 26 20:07:07 UTC 2021
Am 23.02.21 um 08:00 schrieb Stephan Althaus:
> On 02/23/21 12:13 AM, Tim Mooney via openindiana-discuss wrote:
>> In regard to: Re: [OpenIndiana-discuss] safely cleanup pkg cache?,
>> Andreas...:
>>
>>> Am 21.02.21 um 22:42 schrieb Stephan Althaus:
>>>> Hello!
>>>>
>>>> The "-s" option does the minimal obvious remove of the corresponding
>>>> snapshot:
>>
>> My experience seems to match what Andreas and Toomas are saying: -s
>> isn't
>> doing what it's supposed to be doing (?).
>>
>> After using
>>
>> sudo beadm destroy -F -s -v <bename>
>>
>> to destroy a dozen or so boot environments, I'm down to just this
>> for boot environments:
>>
>> $ beadm list
>> BE                                Active Mountpoint Space  Policy Created
>> openindiana                       -      -          12.05M static 2019-05-17 10:37
>> openindiana-2021:02:07            -      -          27.27M static 2021-02-07 01:01
>> openindiana-2021:02:07-backup-1   -      -          117K   static 2021-02-07 13:06
>> openindiana-2021:02:07-backup-2   -      -          117K   static 2021-02-07 13:08
>> openindiana-2021:02:07-1          NR     /          51.90G static 2021-02-07 17:23
>> openindiana-2021:02:07-1-backup-1 -      -          186K   static 2021-02-07 17:48
>> openindiana-2021:02:07-1-backup-2 -      -          665K   static 2021-02-07 17:58
>> openindiana-2021:02:07-1-backup-3 -      -          666K   static 2021-02-07 18:02
>>
>>
>> However, zfs list still shows (I think) snapshots for some of the
>> intermediate boot environments that I destroyed:
>>
>> $ zfs list -t snapshot
>> NAME                                                      USED  AVAIL  REFER  MOUNTPOINT
>> rpool/ROOT/openindiana-2021:02:07-1@install               559M      -  5.94G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-05-17-18:34:55   472M      -  6.28G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-05-17-18:46:32   555K      -  6.28G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-05-17-18:48:56  2.18M      -  6.45G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-06-13-22:13:18  1015M      -  9.74G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-06-21-16:25:04  1.21G      -  9.85G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-08-23-16:17:28   833M      -  9.74G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-08-28-21:51:55  1.40G      -  10.8G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-09-12-23:35:08   643M      -  11.7G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-10-02-22:55:57   660M      -  12.0G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-11-09-00:04:17   736M      -  12.4G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-12-05-01:02:10  1.02G      -  12.7G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2019-12-20-19:55:51   788M      -  12.9G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2020-02-13-23:17:35   918M      -  13.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-01-21-02:27:31  1.74G      -  13.9G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-06-22:47:15  1.71G      -  18.8G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-06:59:02  1.22G      -  19.1G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-19:06:07   280M      -  19.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-19:08:29   280M      -  19.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-23:21:52   640K      -  19.1G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-23:23:46   868K      -  19.2G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-23:48:07   294M      -  19.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-07-23:58:44   280M      -  19.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-08-00:02:17   280M      -  19.3G  -
>> rpool/ROOT/openindiana-2021:02:07-1@2021-02-21-06:24:56  3.49M      -  19.4G  -
>>
>> Now I have to figure out how to map the zfs snapshots to the boot
>> environments that I kept, so that I can "weed out" the zfs snapshots
>> that I don't need.
>>
>> I appreciate all the discussion and info my question has spawned! I
>> didn't anticipate the issue being as complicated as it appears it is.
>>
>> Tim
>
> Hello!
>
> "beadm -s " destroys snapshots.
>
> "rpool/ROOT/openindiana-2021:02:07-1" is the filesystem of the current
> BE.
>
> I don't know why these snapshots are in there,
> but they seem to have been left behind by "pkg upgrade" somehow.
>
> I don't think that "beadm -s" is to blame here.
>
> Maybe an additional parameter would be nice to get rid of old snapshots
> within the BE filesystem(s).
>
> Greetings,
>
> Stephan
>
>
> _______________________________________________
> openindiana-discuss mailing list
> openindiana-discuss@openindiana.org
> https://openindiana.org/mailman/listinfo/openindiana-discuss
Hi,
I think I hit the bug again, even when using "beadm destroy -s":
╰─➤ zfs list -t snapshot
NAME                                                          USED  AVAIL  REFER  MOUNTPOINT
rpool1/ROOT/openindiana-2021:02:26@2021-02-22-16:33:39        489M      -  26.5G  -
rpool1/ROOT/openindiana-2021:02:26@2021-02-24-12:32:24        472M      -  26.5G  -   <- only one snapshot here from Feb. 24th
rpool1/ROOT/openindiana-2021:02:26@2021-02-25-13:03:15           0      -  26.5G  -
rpool1/ROOT/openindiana-2021:02:26@2021-02-25-13:03:50           0      -  26.5G  -
rpool1/ROOT/openindiana-2021:02:26@2021-02-26-08:35:10           0      -  26.5G  -
rpool1/ROOT/openindiana-2021:02:26@2021-02-26-08:35:57           0      -  26.5G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-22-16:33:39    682M      -  1.99G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-24-12:32:24    653M      -  1.99G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-25-13:03:15    632K      -  2.00G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-25-13:03:50    130M      -  2.12G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-26-08:35:10    691K      -  2.07G  -
rpool1/ROOT/openindiana-2021:02:26/var@2021-02-26-08:35:57    178M      -  2.25G  -
╭─andreas@skoll ~
╰─➤ pfexec zfs destroy rpool1/ROOT/openindiana-2021:02:26@2021-02-22-16:33:39
╭─andreas@skoll ~
╰─➤ pfexec zfs destroy rpool1/ROOT/openindiana-2021:02:26/var@2021-02-22-16:33:39
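As a side note, destroying the root-dataset snapshot and the matching /var snapshot one by one works, but zfs can do it in one step: "zfs destroy -r" removes the same-named snapshot on the dataset and all its descendants. A minimal sketch (the command is only printed for review here, not executed):

```shell
#!/usr/bin/env bash
# Build the recursive variant of the two destroys above: "-r" makes zfs
# remove the snapshot of this name on the dataset and all its children
# (here: the BE root dataset and its /var child) in a single command.
snap="rpool1/ROOT/openindiana-2021:02:26@2021-02-22-16:33:39"
cmd="pfexec zfs destroy -r $snap"
echo "$cmd"   # printed for review; run it by hand once you are sure
```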
╭─andreas@skoll ~   <- Two older snapshots removed
╰─➤ beadm list
BE                     Active Mountpoint Space  Policy Created
openindiana-2021:02:24 -      -          23.70M static 2021-02-24 13:33
openindiana-2021:02:25 -      -          14.08M static 2021-02-25 14:03
openindiana-2021:02:26 NR     /          32.54G static 2021-02-26 09:35   <- Three BE's, let's remove the oldest
╭─andreas@skoll ~
╰─➤ pfexec beadm destroy -s openindiana-2021:02:24   <- See, used with -s!
Are you sure you want to destroy openindiana-2021:02:24?
This action cannot be undone (y/[n]): y
Destroyed successfully
╭─andreas@skoll ~
╰─➤ beadm list
BE                     Active Mountpoint Space  Policy Created
openindiana-2021:02:25 -      -          14.08M static 2021-02-25 14:03   <- BE removed
openindiana-2021:02:26 NR     /          32.41G static 2021-02-26 09:35
╭─andreas@skoll ~
╰─➤ beadm list -a
BE/Dataset/Snapshot                                          Active Mountpoint Space   Policy Created
openindiana-2021:02:25
   rpool1/ROOT/openindiana-2021:02:25                        -      -          14.08M  static 2021-02-25 14:03
openindiana-2021:02:26
   rpool1/ROOT/openindiana-2021:02:26                        NR     /          32.41G  static 2021-02-26 09:35
   rpool1/ROOT/openindiana-2021:02:26/var@2021-02-24-12:32:24 -     -          685.24M static 2021-02-24 13:32   <- This snapshot also survived the beadm destroy -s command
   rpool1/ROOT/openindiana-2021:02:26/var@2021-02-25-13:03:15 -     -          654.72M static 2021-02-25 14:03
   rpool1/ROOT/openindiana-2021:02:26/var@2021-02-26-08:35:10 -     -          691K    static 2021-02-26 09:35
   rpool1/ROOT/openindiana-2021:02:26/var@2021-02-26-08:35:57 -     -          177.52M static 2021-02-26 09:35
   rpool1/ROOT/openindiana-2021:02:26@2021-02-24-12:32:24    -      -          502.54M static 2021-02-24 13:32   <- Snapshot still there
   rpool1/ROOT/openindiana-2021:02:26@2021-02-25-13:03:15    -      -          479.87M static 2021-02-25 14:03
   rpool1/ROOT/openindiana-2021:02:26@2021-02-26-08:35:10    -      -          0       static 2021-02-26 09:35
   rpool1/ROOT/openindiana-2021:02:26@2021-02-26-08:35:57    -      -          0       static 2021-02-26 09:35
Andreas