File Size
Jeff Kinz
jkinz at kinz.org
Sat Apr 16 16:32:55 UTC 2005
On Sat, Apr 16, 2005 at 12:29:33PM -0400, Jeff Kinz wrote:
> On Sat, Apr 16, 2005 at 09:48:48AM -0600, brad.mugleston at comcast.net wrote:
> > I've ripped all my CD's to a hard drive (about 8G of files) and
> > have discovered that not all of them ripped very well. I'm now
> > in the process of trying to sort out and either re-rip or delete
> > from my database (I use Grip and Digital DJ - fantastic
> > combination).
> >
> > I need a command - one line would be great - that would write to
> > a file all my files that are less than 130 bytes in size. The
> > output would need the full path and file name and file size. I'm
> > thinking grep and ls and who knows what (in a prior life I could
>
> Hi Brad, the command you want is "find"
>
> For example, to print all files with "goo" in their name
> for the entire file tree starting at "/var":
>
> find /var -name '*goo*' -print
>
> (Note: -print is actually redundant here.)
>
>
> Do a "man find" and look for the file-attribute tests that
> can match on size. I don't know the size option off the top
> of my head.
Yup - here it is:
-size n[bckw]
File uses n units of space. The units are 512-byte blocks
by default or if `b' follows n, bytes if `c' follows n, kilobytes if `k'
follows n, or 2-byte words if `w' follows n. The size does not count
indirect blocks, but it does count blocks in sparse files that are not
actually allocated.
EXAMPLE:
find . -size -130c
I know it's not as much fun as ls, grep, and awk, but what the heck.
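Putting it together for Brad's request (full path, name, and size, written to a file), something like the line below should do it. This is a sketch: the music directory path and output filename are placeholders, and -printf is a GNU find extension.

```shell
# List every regular file smaller than 130 bytes, printing its full
# path and size in bytes, and save the results to a file.
# "$HOME/music" and small_files.txt are placeholders - adjust to taste.
# -printf '%p %s\n' is GNU find; on non-GNU finds, something like
# -exec ls -l {} + gives similar information.
find "$HOME/music" -type f -size -130c -printf '%p %s\n' > small_files.txt
```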
--
http://kinz.org
http://www.fedoranews.org
Jeff Kinz, Emergent Research, Hudson, MA.
More information about the Redhat-install-list mailing list