get rid of trailing whitespace and extra newlines
Co-authored-by: Dan Church <amphetamachine@gmail.com>
Paul Dreik and h3xx committed Apr 6, 2024
1 parent 343a271 commit 607a52a
Showing 5 changed files with 20 additions and 22 deletions.
1 change: 0 additions & 1 deletion Makefile.am
@@ -49,4 +49,3 @@ man_MANS = rdfind.1
#for formatting the source
format:
./do_clang_format.sh
-
8 changes: 4 additions & 4 deletions README.md
@@ -55,7 +55,7 @@ Look for duplicate files in directory /home/pauls/bilder:
It seems like you have 100 files that are not unique
Totally, 24 Mib can be reduced.
Now making results file results.txt

It indicates there are 100 files that are not unique. Let us examine them by looking at the newly created results.txt:

$ cat results.txt
@@ -88,7 +88,7 @@ Rdfind uses the following algorithm. If N is the number of files to search through
11. Sort on device and inode(speeds up file reading). Perform a checksum calculation for each file.
12. Only keep files on the list with the same size and checksum. These are duplicates.
13. Sort list on size, priority number, and depth. The first file for every set of duplicates is considered to be the original.
-14. If flag ”-makeresultsfile true”, then print results file (default).
+14. If flag ”-makeresultsfile true”, then print results file (default).
15. If flag ”-deleteduplicates true”, then delete (unlink) duplicate files. Exit.
16. If flag ”-makesymlinks true”, then replace duplicates with a symbolic link to the original. Exit.
17. If flag ”-makehardlinks true”, then replace duplicates with a hard link to the original. Exit.
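
The core of steps 11-13 is a progressive narrowing: files are first bucketed by size, and checksums are only computed inside buckets that still contain more than one file. A minimal, self-contained sketch of that idea (not rdfind's actual code; `FileEntry` and the placeholder `checksumOf` are hypothetical, and rdfind uses a real cryptographic hash via nettle rather than `std::hash`):

```cpp
#include <cstdint>
#include <fstream>
#include <functional>
#include <iterator>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Hypothetical stand-in for rdfind's per-file bookkeeping.
struct FileEntry {
  std::string path;
  std::uintmax_t size = 0;
};

// Placeholder checksum: read the whole file and hash it with std::hash.
// rdfind itself uses nettle (sha1/sha256/md5) for this step.
std::size_t checksumOf(const std::string& path)
{
  std::ifstream in(path, std::ios::binary);
  std::string data((std::istreambuf_iterator<char>(in)),
                   std::istreambuf_iterator<char>());
  return std::hash<std::string>{}(data);
}

// Keep only files that share both size and checksum with at least one
// other file; everything else cannot be a duplicate.
std::vector<std::vector<FileEntry>>
findDuplicateSets(const std::vector<FileEntry>& files)
{
  std::map<std::uintmax_t, std::vector<FileEntry>> bySize;
  for (const auto& f : files)
    bySize[f.size].push_back(f);

  std::vector<std::vector<FileEntry>> sets;
  for (auto& [size, sameSize] : bySize) {
    if (sameSize.size() < 2)
      continue; // unique size: cannot have a duplicate
    std::map<std::size_t, std::vector<FileEntry>> byChecksum;
    for (const auto& f : sameSize)
      byChecksum[checksumOf(f.path)].push_back(f);
    for (auto& [sum, dupes] : byChecksum)
      if (dupes.size() >= 2)
        sets.push_back(std::move(dupes));
  }
  return sets;
}
```

This is what the man page below means by calculating checksums "only if necessary": a file whose size is unique in the candidate list never needs to be read at all.
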
@@ -130,9 +130,9 @@ The following methods are used to maintain code quality:
- clang format is used, issue make format to execute it
- cppcheck has been run manually and relevant issues are fixed
- [disorderfs](https://packages.debian.org/sid/disorderfs) is used (if available) to verify independence of file system ordering

There is a helper script that does the test build variants, see do_quality_checks.sh in the project root.

## Alternatives

There are some interesting alternatives.
20 changes: 10 additions & 10 deletions configure.ac
@@ -6,7 +6,7 @@ dnl See LICENSE for further details.
VERSION="1.6.next"
AC_INIT([rdfind],[1.6.next])
AC_CONFIG_SRCDIR([rdfind.cc])
-AC_CONFIG_HEADERS([config.h])
+AC_CONFIG_HEADERS([config.h])

AM_INIT_AUTOMAKE

@@ -37,19 +37,19 @@ AC_HEADER_ASSERT
dnl test for nettle
AC_CHECK_HEADER(nettle/sha.h,,[AC_MSG_ERROR([
nettle header files missing. Please install nettle
-first. If you have already done so and get this error message
-anyway, it may be installed somewhere else, maybe because you
+first. If you have already done so and get this error message
+anyway, it may be installed somewhere else, maybe because you
don't have root access. Pass CPPFLAGS=-I/your/path/to/nettle to configure
and try again. The path should be so that \#include "nettle/sha.h" works.
On Debian-ish systems, use "apt-get install nettle-dev" to get a system
wide nettle install.
-])])
+])])
AC_CHECK_LIB(nettle,nettle_pbkdf2_hmac_sha256,,[AC_MSG_ERROR([
Could not link to libnettle. Please install nettle
first. If you have already done so; please run ldconfig
as root or check whether the path libnettle was installed
-to is in your LD_LIBRARY_PATH. If you have nettle
-somewhere else, maybe because you don't have root
+to is in your LD_LIBRARY_PATH. If you have nettle
+somewhere else, maybe because you don't have root
access, pass LDFLAGS=-L/your/path/to/nettle to configure
and try again.
])])
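
For context on what these checks probe: nettle supplies the hash routines rdfind links against. A minimal stand-alone sketch of the SHA-256 interface declared via `nettle/sha.h` (an illustration, not rdfind's code). It builds with roughly `g++ demo.cc -lnettle`, and the `CPPFLAGS`/`LDFLAGS` hints in the error messages above are exactly what you adjust when nettle lives in a non-standard prefix:

```cpp
// Illustration of the nettle API that the configure checks above look for.
#include <nettle/sha.h> // brings in the sha256_* declarations

#include <cstdio>
#include <cstring>

int main()
{
  const char* msg = "hello";

  struct sha256_ctx ctx;
  sha256_init(&ctx);
  sha256_update(&ctx,
                std::strlen(msg),
                reinterpret_cast<const uint8_t*>(msg));

  uint8_t digest[SHA256_DIGEST_SIZE];
  sha256_digest(&ctx, sizeof(digest), digest);

  for (unsigned i = 0; i < sizeof(digest); ++i)
    std::printf("%02x", digest[i]);
  std::printf("\n");
}
```
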
@@ -125,8 +125,8 @@ if test "x$set_more_warnings" != xno; then
-Wzero-as-null-pointer-constant \
-Wparentheses \
-Wdate-time \
--Wextra-semi \
--Wbiznafronck"
+-Wextra-semi \
+-Wbiznafronck"

#these warnings were tried but deemed unuseful for this project:
# -Wunsafe-loop-optimizations \
@@ -170,7 +170,7 @@ AC_COMPILE_IFELSE([AC_LANG_PROGRAM([[
[has_fallthrough=yes],
[has_fallthrough=no])
AC_MSG_RESULT($has_fallthrough)
-if test $has_fallthrough = yes; then
+if test $has_fallthrough = yes; then
AC_DEFINE(FALLTHROUGH, [str], "support for c++17 fallthrough")
else
AC_DEFINE(FALLTHROUGH, [], "support for c++17 fallthrough")
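
The test above only asks whether the compiler accepts the C++17 attribute; the resulting FALLTHROUGH macro then lets the code mark intentional case fall-through without tripping warnings such as -Wimplicit-fallthrough. A stand-alone illustration using the raw attribute (not code from rdfind):

```cpp
// Illustration of the C++17 [[fallthrough]] attribute detected above.
#include <iostream>

enum class Verbosity { quiet, normal, verbose };

void report(Verbosity v)
{
  switch (v) {
    case Verbosity::verbose:
      std::cout << "extra details\n";
      [[fallthrough]]; // intentional: verbose output includes the summary too
    case Verbosity::normal:
      std::cout << "summary\n";
      break;
    case Verbosity::quiet:
      break;
  }
}

int main()
{
  report(Verbosity::verbose); // prints both lines
}
```
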
@@ -179,7 +179,7 @@ fi
dnl valgrind support
dnl unfortunately it makes checking much slower
dnl even if valgrind is not used, so leave it inactive.
-dnl AX_VALGRIND_CHECK
+dnl AX_VALGRIND_CHECK
dnl
dnl instead, just use "VALGRIND=valgrind make check" to test with valgrind
dnl read Makefile.in and write Makefile
12 changes: 6 additions & 6 deletions rdfind.1
@@ -7,7 +7,7 @@
.SH NAME
rdfind \- finds duplicate files
.SH SYNOPSIS
-.B rdfind [ options ]
+.B rdfind [ options ]
.I directory1 | file1
.B [
.I directory2 | file2
@@ -16,15 +16,15 @@ rdfind \- finds duplicate files
.B rdfind
finds duplicate files across and/or within several directories. It calculates
checksum only if necessary.
-rdfind runs in O(Nlog(N)) time with N being the number of files.
+rdfind runs in O(Nlog(N)) time with N being the number of files.

If two (or more) equal files are found, the program decides which of
them is the original and the rest are considered duplicates. This
is done by ranking the files to each other and deciding which has the
highest rank. See section RANKING for details.

By default, no action is taken besides creating a file with the
-detected files and showing the possible amount of saved space.
+detected files and showing the possible amount of saved space.

If you need better control over the ranking than given, you can use
some preprocessor which sorts the file names in desired order and then
@@ -39,7 +39,7 @@ Given two or more equal files, the one with the highest rank is
selected to be the original and the rest are duplicates. The rules of
ranking are given below, where the rules are executed from start until
an original has been found. Given two files A and B which have equal
-size and content, the ranking is as follows:
+size and content, the ranking is as follows:

If A was found while scanning an input argument earlier than B, A
is higher ranked.
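
In concrete terms: with `rdfind originals/ backups/`, an equal file found under `originals/` outranks its copy under `backups/`, so the copy is the one reported (or deleted/linked) as a duplicate. A tiny sketch of just this first rule, with hypothetical field names; the remaining tie-break rules of this section, and the size/priority/depth sort mentioned in the README, are omitted:

```cpp
#include <string>

// Hypothetical per-file record; rdfind's real bookkeeping differs.
struct Candidate {
  std::string path;
  int cmdlineIndex; // index of the input argument the file was found under
};

// True if a outranks b under the rule quoted above: a file found while
// scanning an earlier input argument is higher ranked and becomes the
// original. Further tie-breaks apply when the indices are equal.
bool outranksByInputOrder(const Candidate& a, const Candidate& b)
{
  return a.cmdlineIndex < b.cmdlineIndex;
}
```
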
@@ -110,7 +110,7 @@ General options:
.BR \-sleep " " \fIX\fRms
Sleeps X milliseconds between reading each file, to reduce
load. Default is 0 (no sleep). Note that only a few values are
-supported at present: 0,1-5,10,25,50,100 milliseconds.
+supported at present: 0,1-5,10,25,50,100 milliseconds.
.TP
.BR \-n ", " \-dryrun " " \fItrue\fR|\fIfalse\fR
Displays what should have been done, don't actually delete or link
@@ -146,7 +146,7 @@ DUPTYPE_WITHIN_SAME_TREE files in the same tree (found when processing
the directory in the same input argument as the original)

DUPTYPE_OUTSIDE_TREE the file is found during processing another input
-argument than the original.
+argument than the original.
.SH ENVIRONMENT
.SH DIAGNOSTICS
.SH EXIT VALUES
1 change: 0 additions & 1 deletion release_new_version.txt
@@ -28,4 +28,3 @@ echo $(sha1sum < $pkg |cut -f1 -d' ')" (SHA1)" >>table.txt
echo $(sha256sum < $pkg |cut -f1 -d' ')" (SHA256)" >>table.txt

man2html rdfind.1 |tail -n +3 > rdfind.1.html
-