
Comparing non-integers.


Hongyi Zhao
Apr 9, 2015, 6:58:22 AM

Hi all,

As has been discussed before on this list, and as shown in the manual
page of test:

______________________

INTEGER1 -eq INTEGER2
INTEGER1 is equal to INTEGER2

INTEGER1 -ge INTEGER2
INTEGER1 is greater than or equal to INTEGER2

INTEGER1 -gt INTEGER2
INTEGER1 is greater than INTEGER2

INTEGER1 -le INTEGER2
INTEGER1 is less than or equal to INTEGER2

INTEGER1 -lt INTEGER2
INTEGER1 is less than INTEGER2

INTEGER1 -ne INTEGER2
INTEGER1 is not equal to INTEGER2

______________________

All the above comparisons can be written in any of the following forms;
take -le, for example:

[ INTEGER1 -le INTEGER2 ]

or

[[ INTEGER1 -le INTEGER2 ]]

or

(( INTEGER1 <= INTEGER2 ))

But my question is: when I want to compare two non-integers, say
decimals or irrationals, how should I use these arithmetic
operators?

Regards
--
.: Hongyi Zhao [ hongyi.zhao AT gmail.com ] Free as in Freedom :.

Janis Papanagnou
Apr 9, 2015, 7:03:58 AM

On 09.04.2015 12:58, Hongyi Zhao wrote:
> [...]
>
> (( INTEGER1 <= INTEGER2 ))
>
> But my question is: when I want to compare two non-integers, say
> decimals or irrationals, how should I use these arithmetic
> operators?

Use a shell that supports floating point arithmetic, like, e.g., ksh.
Then you can use these operators as you'd expect.

Janis

>
> Regards
>

Barry Margolin
Apr 9, 2015, 10:43:05 AM

In article <mg5mar$8si$1...@news.m-online.net>,
Or just don't use a shell script. Use awk or perl, for instance.
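The awk route can also be driven from a shell script (a sketch; the variable names are just for illustration): awk does floating-point arithmetic natively, and its exit status can feed a shell `if` directly.

```shell
# Delegate the floating-point comparison to awk. The BEGIN block
# exits 0 (success) when the comparison holds, 1 otherwise, so the
# awk invocation itself acts as the test.
a=3.14 b=7.2
if awk -v a="$a" -v b="$b" 'BEGIN { exit !(a <= b) }'; then
  verdict="$a <= $b"
else
  verdict="$a > $b"
fi
echo "$verdict"
```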

--
Barry Margolin, bar...@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***

Hongyi Zhao
Apr 9, 2015, 7:12:09 PM

On Thu, 09 Apr 2015 10:42:58 -0400, Barry Margolin wrote:

> Or just don't use a shell script. Use awk or perl, for instance.

IMHO, for most people and in most cases, using awk/perl alone to solve
problems under *nix/Linux is not common. More often they are called from
the shell, or embedded in a shell script as small awk/perl snippets.

Essentially, shell, awk, and perl are all languages, but the shell is
especially good at integrating all of the tools under *nix/Linux for
system-administration jobs, while awk/perl are only good in some
specific cases.

Kaz Kylheku
Apr 9, 2015, 7:55:55 PM

On 2015-04-09, Hongyi Zhao <hongy...@gmail.com> wrote:
> On Thu, 09 Apr 2015 10:42:58 -0400, Barry Margolin wrote:
>
>> Or just don't use a shell script. Use awk or perl, for instance.
>
> IMHO, for most people and in most cases, using awk/perl alone to solve
> problems under *nix/Linux is not common. More often they are called from
> the shell, or embedded in a shell script as small awk/perl snippets.

Okay, then use a shell script with something embedded.

> Essentially, all of the shell/awk/perl are languages, but shell is
> especially efficient for integrating all of the tools under *nix/Linux
> for system administration jobs, while awk/perl are only good at some
> specific cases.

Yes, specific cases like ... working with non-integers!

Janis Papanagnou
Apr 10, 2015, 3:37:05 AM

Am 10.04.2015 um 01:12 schrieb Hongyi Zhao:
> On Thu, 09 Apr 2015 10:42:58 -0400, Barry Margolin wrote:
>
>> Or just don't use a shell script. Use awk or perl, for instance.
>
> IMHO, for most people and in most cases, using awk/perl alone to solve
> problems under *nix/Linux is not common.

The question should not be what's common, but rather what's the
appropriate way to solve a task.

> More often they are called from
> the shell, or embedded in a shell script as small awk/perl snippets.
>
> Essentially, shell, awk, and perl are all languages, but the shell is
> especially good at integrating all of the tools under *nix/Linux for
> system-administration jobs,

With shells you can (and often just have to) use external programs.
Combining many external tools may lead to severe efficiency issues.

Actually, shells are specifically *inefficient* at doing certain
tasks. (And bash specifically, a shell that is used by a lot of
people on Linux, is also inefficient compared with, say, ksh.)

If you want to do math with, say, bash - a shell which does not
support floating point math - you have to use shell-external tools,
like (for example) bc; i.e. you'll have another overhead that makes
your task inefficient, since you have to call an external process -
which is costly! - at every place where you need to do your math.

Your "efficiency" argument is not well thought through!

> while awk/perl are only good at some specific cases.

Awk is good for text processing, and pattern matching, and it has
floating point math integrated.

If your task needs a lot of external tool integration, and FP math,
and efficiency, then use ksh, as I proposed upthread.

Janis

>
> Regards
>

Barry Margolin
Apr 10, 2015, 1:22:13 PM

In article <mg7106$cj8$1...@aspen.stu.neva.ru>,
Hongyi Zhao <hongy...@gmail.com> wrote:

> On Thu, 09 Apr 2015 10:42:58 -0400, Barry Margolin wrote:
>
> > Or just don't use a shell script. Use awk or perl, for instance.
>
> IMHO, for most people and in most cases, using awk/perl alone to solve
> problems under *nix/Linux is not common. More often they are called from
> the shell, or embedded in a shell script as small awk/perl snippets.
>
> Essentially, shell, awk, and perl are all languages, but the shell is
> especially good at integrating all of the tools under *nix/Linux for
> system-administration jobs, while awk/perl are only good in some
> specific cases.

The way I see it, shell scripts are ideal when the primary goal is to
automate and integrate actions that are already provided in other Unix
commands. There are lots of tools for file manipulation, and shell
scripts are great at combining these.

But for intense manipulation of numerical and string data, shell scripts
are less than ideal. It's gotten better over time (we no longer have to
use `expr ...` to perform simple arithmetic, and there are ${var...}
operators for some common string operations), but it's still pretty
cumbersome. And the lack of floating point can be a deal-breaker.
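The improvements mentioned above look like this in a modern POSIX shell (a sketch; the path is just for illustration):

```shell
# Arithmetic no longer needs an external `expr` process:
n=$(( 6 * 7 ))           # replaces: n=`expr 6 \* 7`
echo "$n"

# And ${var...} expansions cover common string operations in-process:
path="/usr/local/bin/awk"
base="${path##*/}"       # strips the longest */ prefix, like `basename`
echo "$base"
```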

Perl has extensive string and numeric capabilities, and also makes it
easy to invoke external commands when necessary. So if a script is
primarily doing data manipulation, and only secondarily interacting with
other commands, then it may be a better language.

Of course, it's all just a matter of personal preference.

Hongyi Zhao
Apr 11, 2015, 3:01:58 AM

On Fri, 10 Apr 2015 09:37:00 +0200, Janis Papanagnou wrote:

> With shells you can (and often just have to) use external programs.
> Combining many external tools may lead to severe efficiency issues.

Then what about the core of the Unix philosophy: each program should
focus on one thing and do it well, and then let the programs work
together with each other?

Janis Papanagnou
Apr 11, 2015, 4:40:16 AM

On 11.04.2015 09:01, Hongyi Zhao wrote:
> On Fri, 10 Apr 2015 09:37:00 +0200, Janis Papanagnou wrote:
>
>> With shells you can (and often just have to) use external programs.
>> Combining many external tools may lead to severe efficiency issues.
>
> Then what about the core of the Unix philosophy: each program should
> focus on one thing and do it well, and then let the programs work
> together with each other?

Short answer: You can follow a philosophy without thinking, or you can
implement sophisticated solutions; it's your choice.

A slightly more formalized answer: Assume you have "basic tasks" that
are handled by a set of programs A, B, ..., Z, each an implementation
for a specific "solution domain", and you have a set of "glue
operators" a, b, ..., z. Each program and operator (the combination of
programs) has some costs (execution time, bandwidth, memory, etc.).
Now some of the programs have a larger application domain, covering
more than one topic. Instead of an implementation of A z B y C x D w E
you could more efficiently do, say, C x F, or just use G.

Also keep in mind that specialized Unix tools usually handle every
variant within a specific solution domain, most of which are not
required for a given concrete problem; so another tool that you
already use in your solution could often handle those tasks, and you
don't need to add another process and the communication glue between
those processes.

In the current case of the shell: why do you think there was an
evolution in shell development (WRT supported features of the
technical problem domain) from the Bourne shell to ksh and bash? The
Bourne shell was perfectly capable of gluing external specialized
programs together. One reason is simply efficiency. Another is that
it's clumsy (or in some cases impossible) to pass state information
between external tools.

Just to be clear: The "Unix Philosophy" is not wrong. You just should
be aware of the options and implications.

Janis

>
> Regards
>

Dan Espen
Apr 11, 2015, 10:49:12 AM

Janis Papanagnou <janis_pa...@hotmail.com> writes:

> Just to be clear: The "Unix Philosophy" is not wrong. You just should
> be aware of the options and implications.

The "Unix Philosophy" is a fabrication.

If this philosophy is so ingrained in Unix,
which man page is this philosophy mentioned in?

--
Dan Espen

Kaz Kylheku
Apr 11, 2015, 11:09:22 AM

The documentation of this philosophy exists in the man pages as an "emergent
phenomenon", arising from the numerous small man pages that document simple
things in isolation, and reference each other.

Lew Pitcher
Apr 11, 2015, 11:17:37 AM

On Saturday April 11 2015 10:49, in comp.unix.shell, "Dan Espen"
AFAIK, not in the man pages. But, in a plethora of supporting documentation
from AT&T / Bell Labs.

Like in the Preface to The Unix Programming Environment (Brian W Kernighan
and Rob Pike, AT&T Bell Labs, 1984), where the authors of Unix say
"Instead, what makes it [Unix] effective is an approach to programming, a
philosophy of using the computer. Although that philosophy can't be
written down in a single sentence, at its heart is the idea that the
power of a system comes more from the relationship among programs than
from the programs themselves. Many UNIX programs do quite trivial tasks
in isolation, but, combined with other programs, become general and
useful tools."

Or in "The Unix Time-Sharing System" (D.M. Ritchie and K. Thompson, Bell
Labs, 1974), where the authors of Unix say
"... there have always been fairly severe size constraints on the system
and its software. Given the partially antagonistic desires for reasonable
efficiency and expressive power, the size constraint has encouraged not
only economy, but also a certain elegance of design. This may be a thinly
disguised version of the "salvation through suffering" philosophy, but in
our case it worked."

Just two of the many /official/ places where the "Unix philosophy" is
discussed.

HTH
--
Lew Pitcher
"In Skills, We Trust"
PGP public key available upon request

Janis Papanagnou
Apr 11, 2015, 11:24:41 AM

On 11.04.2015 16:49, Dan Espen wrote:
> Janis Papanagnou <janis_pa...@hotmail.com> writes:
>
>> Just to be clear: The "Unix Philosophy" is not wrong. You just should
>> be aware of the options and implications.
>
> The "Unix Philosophy" is a fabrication.

Sure. As every "philosophy" is a fabrication. What are you after?

You surely know what quotes around words mean? And you remember
what the question was?

>
> If this philosophy is so ingrained in Unix,
> which man page is this philosophy mentioned in?

The man pages describe how the Unix commands and functions operate.
(To my knowledge there's no manual document delivered with the OS
that would explain the design philosophy of Unix.) You know that.

(Anyway, I'm not sure what you're aiming at with that post. The other
poster clearly stated what he means by that term; his description
of that "philosophy" is what has been taught for more than three
decades.)

Janis

frank.w...@gmail.com
Apr 11, 2015, 12:59:23 PM

From Lew Pitcher:
> Like in the Preface to The Unix Programming Environment (Brian W Kernighan
> and Rob Pike, AT&T Bell Labs, 1984), where the authors of Unix say
>  "Instead, what makes it [Unix] effective is an approach to programming, a
>  philosophy of using the computer. Although that philosophy can't be
>  written down in a single sentence, at its heart is the idea that the
>  power of a system comes more from the relationship among programs than
>  from the programs themselves. Many UNIX programs do quite trivial tasks
>  in isolation, but, combined with other programs, become general and
>  useful tools."

I think this isolated piece of the discussion is about using one
program instead of a set of programs in accordance with a philosophy,
and the reply was that often a single specialized program is better.
Up above BWK is suggesting that we should have a general-purpose set
of interoperable tools, and that is a very good suggestion, but he is
not suggesting that we should not use the right tool for the right job.

Frank

Dan Espen
Apr 11, 2015, 1:15:18 PM

In other words, it's a product of imagination.

Ever see the TROFF man page?
(Today the groff man page.)
Is that a small simple tool?

TROFF (and NROFF) were one of the keys to early Unix success.

--
Dan Espen

Dan Espen
Apr 11, 2015, 1:17:59 PM

Very interesting and I congratulate you on finding this stuff.

Note that the first says "Many UNIX programs", not all UNIX programs.

The second doesn't really advance a philosophy, but instead describes
constraints.


--
Dan Espen

Dan Espen
Apr 11, 2015, 1:22:58 PM

I've been using UNIX since I first encountered it at Bell Labs
pretty early on.

I know that one of UNIX's strengths is pipelines feeding the output
of one command into the input of the other.
I've just never seen a document that says this is the UNIX philosophy
and all things must be this way to qualify as UNIX tools.

The way this "philosophy" argument is being used today
denies reality. Some things are just as complex as they need to be.

--
Dan Espen

Janis Papanagnou
Apr 11, 2015, 2:31:19 PM

On 11.04.2015 19:15, Dan Espen wrote:
>
> TROFF (and NROFF) were one of the keys to early Unix success.

AFAIK, troff was the _reason_ why UNIX had been implemented.

Janis

Janis Papanagnou
Apr 11, 2015, 2:56:59 PM

On 11.04.2015 19:22, Dan Espen wrote:
>
> I know that one of UNIX's strengths is pipelines feeding the output
> of one command into the input of the other.
> I've just never seen a document that says this is the UNIX philosophy
> and all things must be this way to qualify as UNIX tools.

Oh, that's right. The philosophy aspect is generally one of thinking
about how things work (or should work), including artificial things,
and then taking that paradigm as an example and calling it a
philosophy. There's not much magic behind it. (Specifically,
philosophies are different from [documented] designs. Even though
there was a design in UNIX, you can see a "philosophy" behind it.)

>
> The way this "philosophy" argument is being used today denies reality.

I disagree on that. The "philosophy argument" in question is used
today no differently than in former times. It is still valid in
principle, even though you can of course (as so often) find exceptions.

> Some things are just as complex as they need to be.

Sure.

Janis

Lew Pitcher
Apr 11, 2015, 3:16:37 PM

On Saturday April 11 2015 14:31, in comp.unix.shell, "Janis Papanagnou"
Actually, roff ("runoff") was the name of the utility; nroff was "new roff",
troff was roff for typesetters, and ditroff was a device-independent troff.

Unix came first, as a benchtop experiment. For financial justification,
Ritchie, Thompson, etc. sold AT&T on it as a document processing system, and
built roff for it. (ref: http://cm.bell-labs.com/who/dmr/notes.html)

Barry Margolin
Apr 11, 2015, 8:53:10 PM

In article <mgbc7b$4kd$1...@dont-email.me>, Dan Espen <des...@verizon.net>
wrote:
It's not in man pages, but it was in many papers that explained the
design and evolution of Unix.

Barry Margolin
Apr 11, 2015, 9:01:37 PM

In article <mgbkuc$6ge$2...@dont-email.me>, Dan Espen <des...@verizon.net>
wrote:

> Note that the first says "Many UNIX programs", not all UNIX programs.

So? If the philosophy isn't implemented totally consistently throughout
the system, does that invalidate it? Sometimes pragmatism takes
precedence.

Admittedly, as Unix has evolved, it has drifted further and further from
the purity of its early design. Lots of programs have extra built-in
features that obviate combining them with other tools. When you notice
that X and Y are frequently used together, it makes sense to combine
them in some way to make it easier (and often more efficient); that's
where we get things like the -z option to tar, so we don't have to pipe
to/from gzip every time.
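For instance, the two spellings below should produce equivalent archives (a sketch; it creates small scratch files in the current directory):

```shell
# Classic pipeline vs. tar's built-in -z convenience option.
echo "hello" > demo.txt
tar cf - demo.txt | gzip > pipeline.tar.gz   # Unix-philosophy pipeline
tar czf builtin.tar.gz demo.txt              # same result, one command

# Both archives list the same member:
members_a="$(tar tzf pipeline.tar.gz)"
members_b="$(tar tzf builtin.tar.gz)"
echo "$members_a"
echo "$members_b"
```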

Kaz Kylheku
Apr 11, 2015, 9:51:54 PM

On 2015-04-11, Dan Espen <des...@verizon.net> wrote:
> Kaz Kylheku <k...@kylheku.com> writes:
>
>> On 2015-04-11, Dan Espen <des...@verizon.net> wrote:
>>> Janis Papanagnou <janis_pa...@hotmail.com> writes:
>>>
>>>> Just to be clear: The "Unix Philosophy" is not wrong. You just should
>>>> be aware of the options and implications.
>>>
>>> The "Unix Philosophy" is a fabrication.
>>>
>>> If this philosophy is so ingrained in Unix,
>>> which man page is this philosophy mentioned in?
>>
>> The documentation of this philosophy exists in the man pages as an "emergent
>> phenomenon", arising from the numerous small man pages that document simple
>> things in isolation, and reference each other.
>
> In other words, it's a product of imagination.
>
> Ever see the TROFF man page?
> (Today the groff man page.)
> Is that a small simple tool?

It pretty much is: a fairly simple algorithm for processing input that
you can learn in a day (and forget in five). The rest is just libraries
of built-in commands and macros.

Because everything important in troff is delimited with a dot on a new line,
you can easily add custom processing with external tools, such as your own
scripts. Just invent a start and end marker that your tool recognizes:

.bhak
your syntax here
.ehak

That's basically how the pic, tbl and eqn tools "integrate" into troff.
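A minimal sketch of such a preprocessor, using the invented .bhak/.ehak markers from above (the marker names are hypothetical) with awk as the external tool: it transforms only the delimited region and passes everything else through untouched, which is essentially what pic, tbl and eqn do with their own regions.

```shell
# Build a small troff-like input containing a custom delimited region.
cat > sample.tr <<'EOF'
before
.bhak
shout this line
.ehak
after
EOF

# The "preprocessor": upper-case only the text between the markers.
filtered="$(awk '
  /^\.bhak/ { inside = 1; next }   # start marker: begin transforming
  /^\.ehak/ { inside = 0; next }   # end marker: stop transforming
  inside    { print toupper($0); next }
            { print }              # outside the region: pass through
' sample.tr)"
echo "$filtered"
```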