What is the difference between telnet and rlogin? They appear to
accomplish the same objective, that is, logging onto a remote host. I
can find no explanation comparing the two in my UNIX books.
Regards,
Dan
rlogin and rsh allow .rhosts / hosts.equiv authentication without a
password. For stronger authentication (as well as encryption of the
data passed between the computers), ssh can be used to
replace rlogin, rsh, and telnet.
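As a concrete sketch (the hostnames and user name below are made up), the trust file the r-commands consult is just a list of hosts, and ssh has a drop-in replacement for each r-command:

```shell
# Sketch of the ~/.rhosts format the r-commands consult.  Each line
# names a client host, optionally followed by a remote user, that may
# log in with NO password at all:
cat > /tmp/demo.rhosts <<'EOF'
trusted.example.com dan
other.example.com
EOF

# The remote rlogind/rshd effectively asks: is the connecting host listed?
if grep -q '^trusted\.example\.com' /tmp/demo.rhosts; then
    echo "no password needed"
fi

# The ssh equivalents of the r-commands (encrypted, key-based auth):
#   ssh remotehost               # replaces rlogin (and telnet logins)
#   ssh remotehost uname -a      # replaces rsh
#   scp file remotehost:/tmp/    # replaces rcp
```

The grep is only standing in for the host-matching that rlogind does; the point is that the whole "authentication" is a plaintext lookup, which is why ssh is the usual advice.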
--
------------------------------------------------------------------------
Timothy J. Lee timlee@
Unsolicited bulk or commercial email is not welcome. netcom.com
No warranty of any kind is provided with this message.
IMO, telnet is NOT made for telnetd only. There are many other services
to which you can telnet.
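For instance (hostnames here are made up, and each needs a live server listening on the other end):

```shell
# telnet is a general TCP client, not just a front end for telnetd; it
# will talk to any plain-text service if you give it a port number:
#   telnet mailhost.example.com 25   # speak SMTP to a mail server by hand
#   telnet www.example.com 80        # then type an HTTP request yourself
#   telnet localhost 110             # poke a local POP3 daemon
# What you would type by hand after connecting to port 80:
printf 'GET / HTTP/1.0\r\n\r\n'
```

rlogin can do none of this; it only speaks the rlogin protocol to rlogind.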
--
-------------------------------------------------------------------------------
-= Perly < pe...@xnet.com > =- BOFH in training + C maniac
"Prick your finger it is done,
the moon has now eclipsed the sun.
The angel has spread its wings,
the time has come for bitter things." - Antichrist "superstar"
Before Unix was networked, telnet existed to connect
two hosts on the Internet, no matter what O/S they
were running. Telnet works fine between a VAX and
a mainframe, as well as it does between two Unix
boxes.
Once Unix was networked, a Unix-only connection
mechanism was devised, and all of the "r-commands"
use it: rlogin, rsh, rdist, rcp, etc. Rlogin will
not work at all unless both machines run Unix, or
mimic running Unix at the connection level.
Telnet supports many options that are worthless
on Unix: EBCDIC conversion, block transmit mode,
local echo mode, strange timeout settings for when
the satellite is out of range, etc.
Rlogin supports many options that are worthless
on non-Unix systems: enforced remote echo, single
character transmit mode, propagation of several
environment variables, etc.
Telnet is an ATV; it can get you anywhere, but who
needs one when there are roads all of the way to your
destination?
Rlogin is a car; it is easy to drive, but you don't
want to leave the roads.
Doug Freyburger, Collective Technologies, a Pencom company
IMO, ssh _should_ be used. Rlogin has known security problems, unless
you put all your trust in firewalls.
-Robin "Just Don't Ask Me What They Are" Powell <grin>
--
My Home Page (Too Much Information!):http://www.csclub.uwaterloo.ca/~rlpowell/
"Government is not suggestion nor persuasion, it is force. When you advocate
any government action, you first must believe that violence is the best answer
to the question at hand." -- Laws of the Jungle, by Allen Thornton
: IMO, ssh _should_ be used. Rlogin has known security problems, unless
: you put all your trust in firewalls.
Of course there is the supposition that perhaps the machines in question
are firewalled off to the intranet. Yet, even there I would agree, ssh
it, especially considering how many 'powerlusers' lurk there, not to
mention the fact that the highest percentage of security leaks are from
the inside-out...
Laterer,
Ron DuFresne
--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"Cutting the space budget really restores my faith in humanity. It
eliminates dreams, goals, and ideals and lets us get straight to the
business of hate, debauchery, and self-annihilation." -- Johnny Hart
***testing, only testing, and damn good at it too!***
OK, so you're a Ph.D. Just don't touch anything.
A problem I have with the "use ssh" crowd is the supposition that
the admin has that option. Yes, in an ideal world, the UNIX admin can
install software that they know will improve security without any
management consequences, but in the business world, with strange
vendor requirements, and management/co-admins that don't understand the
world of freely available software, ssh is out of the question.
I had enough problems getting gcc on some systems I once administered.
Even the senior UNIX admin at the site said it shouldn't be on, as there
was no multi-thousand dollar support contract on it, and no money was
available to get such a support contract on a C compiler as it wasn't
needed.
Now, if anyone had a sure fire way to overcome the haunting fear of
freely available software that has been at every corporation I've
worked at, I'd welcome it. :-) Some day, I'm going to walk into a
project/company, and find that less, pine, gcc, and at least three other
free tools have already been installed. Some day... (Or even just
an up to date version of Sendmail!)
The basic attitude I seriously encountered once at a job regarding
inside security threats was "internal security isn't a problem. If they
try to gain access to the computers, we fire them, end of story."
--
Mr. Alcourt alc...@execpc.com http://www.execpc.com/~alcourt/
"I may disagree with what you say, but I will defend unto the death
your right to say it." (Voltaire)
I am _not_ a manager but put yourself in their position. They bear
responsibility to _assure_ that the computing resources are stable
and reliable and that the people who use those computers can get their
job done. Who is responsible when something breaks? What happens
if unsupported software 'breaks' and an hour's or a day's worth of work for,
say, 50, or 100, or 1000 people is lost?
The standard admin answer of 'oh, I'll write a script and fix it' is
not reassuring to those who have to answer to higher-ups who are
looking at the bottom line. Yes.. you might 'fix' it this time, but what
happens next time? Or when you're not there?
Lots of talk about NT and UNIX... usually in terms of function and
reliability. IMHO, UNIX is history once NT gets more reliable because
the way that UNIX systems are maintained in a lot of places makes
them _unreliable_ and part of this goes back to installation and
configuration of software applications. Yes.. we have CDE.. but
how many systems are maintained with 'out-of-the-box' apps?
UNIX folks need to take this issue seriously if they really want
UNIX to survive. Otherwise.. they're gonna be back-end servers
with out-of-box apps like Oracle and such.
Excuse me but making a comparison between the reliability of UNIX and NT is
about as absurd as using the very words reliability and NT in the same
sentence. I fail to see how NT (which cannot even drive more than 4 CPUs on a
system) will replace UNIX - even if it should, one day, become a reliable
operating system. It would fail drastically on the issue of scalability
alone. The paradigm is much the same as those who saw UNIX as a threat to the
mainframes way back when - mainframes are still doing a very brisk business
today as will UNIX in the years to come.
The problem of reliability, IMHO, comes from IT managers not understanding
the paradigm shift between desktop systems and serious server systems. This
results in under-staffing and insufficiently qualified personnel taking on
the responsibilities of managing networks of computer systems running very
powerful operating systems. There is no excuse for a well managed, well
maintained network of UNIX systems being unreliable. I can find nobody to
stand by the same statement regarding a network of NT systems - coincidence?
You be the judge.
I don't necessarily have a problem with NT. Each platform has its place (be it
desktop systems, server systems, mainframe systems or niche systems). I just
don't see NT being any kind of a threat to UNIX.
Best regards,
Chris Morgan
-----------== Posted via Deja News, The Discussion Network ==----------
http://www.dejanews.com/ Search, Read, Discuss, or Start Your Own
If the management cannot or does not want to trust their admins
either the management or the admins ought to be changed.
>The standard admin answer of ' oh I'll write a script and fix it' is
>not reassuring to those that have to answer to higher ups that are
>looking at the bottom line. Yes.. you might 'fix' it this time but what
>happens next time? or when you're not there?
If "you" is not there then no kind of support is going to help, since
there is no-one around who's got the permissions to fix it.
>Lots of talk about NT and UNIX... usually in terms of function and
>reliability. IMHO, UNIX is history once NT gets more reliable because
>the way that UNIX systems are maintained in a lot of places makes
>them _unreliable_ and part of this goes back to installation and
>configuration of software applications. Yes.. we have CDE.. but
>how many systems are maintained with 'out-of-the-box' apps?
>UNIX folks need to take this issue seriously if they really want
>UNIX to survive. Otherwise.. they're gonna be back-end servers
>with out-of-box apps like ORACLE and the such.
Unix boxes have uptimes longer than the release cycles of NT. Yes,
NT is about management, Unix is about getting the job done. Aside from
that, any kind of system can be messed up by someone who does not
know enough about it. It is just that M$ systems have the capability
to do it themselves built in.
Cheers,
Juergen
--
\ Real name : Jürgen Heinzl \ no flames /
\ EMail Private : jue...@monocerus.demon.co.uk \ send money instead /
> I am _not_ a manager but put yourself in their position. They bear
> responsibility to _assure_ that the computing resources are stable
> and reliable and that people that use those computers can get their
> job done. Who is responsible when something breaks? What happens
> if unsupported software 'breaks' and an hour or day's worth of work of
> say 50, or a 100, or a 1000 people is lost?
> The standard admin answer of ' oh I'll write a script and fix it' is
> not reassuring to those that have to answer to higher ups that are
> looking at the bottom line. Yes.. you might 'fix' it this time but what
> happens next time? or when you're not there?
> Lots of talk about NT and UNIX... usually in terms of function and
> reliability. IMHO, UNIX is history once NT gets more reliable because
> the way that UNIX systems are maintained in a lot of places makes
> them _unreliable_ and part of this goes back to installation and
> configuration of software applications. Yes.. we have CDE.. but
> how many systems are maintained with 'out-of-the-box' apps?
> UNIX folks need to take this issue seriously if they really want
> UNIX to survive. Otherwise.. they're gonna be back-end servers
> with out-of-box apps like ORACLE and the such.
Larry,
if you live in the belief that the computer industry will ever deliver
turn-key systems with zero administration, well, that's up to you.
The experience with all machinery more complicated than apples and oranges
is that more or less skilled experts will "provide the glue"
between design and practice.
And so long as this (local) expertise is denied the possibility to do
its work, you will have limping / non-optimal computers.
See it as a possibility instead: a local sysadmin may very well
install local tools. Their source may be home-made, shareware
or commercial. The origin is not important. What is important
is documentation and good "workmanship practice".
The key may very well be called ISO-9000 standards!!
A good sysadmin will deliver working products. And good products
are maintainable by any good sysadmin.
Bad products, on the other hand, are maintained by "black magic"
and service packs (non-specific fixes to all kinds of problems).
And, to be frank, many products delivered by GNU et al. have
a higher quality than their commercial counterparts. So why
limit your site to sub-quality stuff?
--
Peter Håkanson Phone +46 0708 39 23 04
Network Management AB Fax +46 031 779 7844
Email : use peter (at) gbg (dot) netman (dot) se No copy to sanford wallace!
What happens if supported software "breaks" and an hour or day's worth
of work is lost while you wait for the vendor to try to find a
solution?
The old "free software == unsupported software" argument is such
rubbish. Most third party support is expensive and unresponsive. Usually
in the time it takes to get past hold on a commercial support line you
could have posted something on the appropriate USENET group and got the
answer yourself. There's way more support available for, say, Apache
than for Netscape Enterprise. And if your free software is something
really obscure, there's a good chance you're only an e-mail away from
the person who wrote it, who will be only too pleased to hear from
someone who uses his baby.
I've seen (more than once) situations where applying a patch to fix one
vendor supplied problem (be it OS or application) breaks another app. If
that app came as source then there's a reasonable chance that a rebuild
on the newly patched system will get it working again. If you bought a
binary, better hope that whoever sold it to you has got there first and
fixed it.
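A sketch of that rebuild, with a made-up package name and path, and the usual from-source ritual:

```shell
# Hypothetical recovery after a vendor OS patch breaks a locally built
# app: with the source on hand, rebuilding against the newly patched
# system libraries is often all it takes.
cd /usr/local/src/someapp-1.2      # path and package name are illustrative
./configure --prefix=/usr/local    # re-detect the patched environment
make                               # rebuild against the new libraries
make install                       # as root (or via sudo)
```

With a binary-only product, the equivalent step is waiting for the vendor to ship a rebuilt binary.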
> The standard admin answer of ' oh I'll write a script and fix it' is
> not reassuring to those that have to answer to higher ups that are
> looking at the bottom line. Yes.. you might 'fix' it this time but what
> happens next time? or when you're not there?
Hang on, does M$ tech support come out and fix your machine in the
middle of the night now? Your sys-admin's not there sixteen hours a day
(if they're lucky). What do you do the rest of the time?
> Lots of talk about NT and UNIX... usually in terms of function and
> reliability. IMHO, UNIX is history once NT gets more reliable because
> the way that UNIX systems are maintained in a lot of places makes
> them _unreliable_
Sure, there are a lot of second-rate UNIX sys-admins about. They're
usually developers (or whatever) who have been forced in to doing the
job because the management won't pay to hire a specialist. If that's
your management policy you deserve all the downtime you get. But there
are also a lot of newly MCSE qualified charlatans out to make a fast
buck with no background, experience or real interest in what they are
doing. A little knowledge is indeed a dangerous thing. Especially when
you're running computers.
> and part of this goes back to installation and
> configuration of software applications. Yes.. we have CDE..
Don't remind me!
> but
> how many systems are maintained with 'out-of-the-box' apps?
Why is that a good thing? No maintenance tool is as valuable as an
understanding of what the system is really doing. There will always come
a time when you need to get "under the hood" with vi, so why not be
prepared?
> UNIX folks need to take this issue seriously if they really want
> UNIX to survive. Otherwise.. they're gonna be back-end servers
> with out-of-box apps like ORACLE and the such.
Isn't that what Unix has always been best at? NT is better for the
average desktop user, there's no point pretending otherwise.
Rob
nope.. competent and skilled support will always be needed.
> turn-key systems with zero administration, well that's up to you.
>
> The experience with all machinery more comlicated then apples and oranges
> is that more or less skilled experts will "provide the glue"
> between design and practice.
any engineered system.. way beyond computers, will require competent support.
>
> And so long this (local)expertize is denied the possibility to do
> it's work, you will have limping / non-optimal computers.
>
> See it as a possibility instead, a local sysadmin may very well
> install local tools. Their soure may be home-made, Shareware
> or commercial. The origin is not importent. What is importent
> is documentation and good "workmanship practice".
> The key may very well be called ISO-9000 standards!!
I agree with this but ISO is similar to SEI Level 5, which does NOT
describe many sites' methods of running their computer operations.
> A good sysadmin will deliver working products. And good products
> are maintanable by any good sysadmin.
a good sysadmin FUNCTION will do this. An individual will not
do this when they are not there. The function does it as long as
is needed. Individuals come and go.
> Bad products on the other hands are maintanied by "black magic"
> and service packs, (non-specific fixes to all kind of problems).
>
> And, to be frank, many products delivered by GNU et.al has
> a higher quality then their commercial counterparts. So why
> limit your site to sub-quality stuff ?
>
software and systems are no longer for 'geeks' who just get
a kick out of doing it. Corporate fortunes.. even people's lives
depend on it. 'OOPS' type incidents are no longer just oops
in many cases. Those injured are starting to demand compensation
for damages.
It really doesn't matter how software or system maintenance is
produced, but it does matter that it be reliable and dependable -
ergo... systems - no matter what flavor - that go 'oops' are rapidly
falling out of favor. IMHO, UNIX has the most potential to be
bulletproof, but it also has the most vulnerabilities because it can be
configured so many different ways.. and still sorta work. NT comes
out of the box with more limitations (and right now it is more unreliable),
but the 'concept' of it coming out of the box and working properly is
the concept that those who are responsible for reliable systems are after.
I heard in a talk about Java a few months back that Windows 2000 (which
I believe is the replacement for both Windows 95/98 and NT) has
approximately 30 to 50 million lines of code (!). I speculate that the
Windows OS is getting bulkier at a faster rate than the unix-like (i.e.
including Linux) OS's. If that is true, Windows doesn't have a chance
of long-term survivability in the server end of the machine spectrum
simply because it will become too difficult to manage by the end-user
and by Microsoft itself. IMHO, it still is excellent as a single-user
desktop OS (e.g. word-processing, spreadsheets, etc.). (Include MacOS
in that last sentence, too.)
--
--------------------------------------
Brian Galloway
Hewlett-Packard
Larry C. Gross NSWCDD K74 Room. 120 wrote:
> > A problem I have with the "use ssh" crowd is the supposition that
> > the admin has that option. Yes, in an ideal world, the UNIX admin can
> > install software that they know will improve security without any
> > management consequences, but in the business world, with strange
> > vendor requirements, and management/co-admins that don't understand the
> > world of freely available software, ssh is out of the question.
>
> I am _not_ a manager but put yourself in their position. They bear
> responsibility to _assure_ that the computing resources are stable
> and reliable and that people that use those computers can get their
> job done. Who is responsible when something breaks? What happens
> if unsupported software 'breaks' and an hour or day's worth of work of
> say 50, or a 100, or a 1000 people is lost?
>
> The standard admin answer of ' oh I'll write a script and fix it' is
> not reassuring to those that have to answer to higher ups that are
> looking at the bottom line. Yes.. you might 'fix' it this time but what
> happens next time? or when you're not there?
>
> Lots of talk about NT and UNIX... usually in terms of function and
> reliability. IMHO, UNIX is history once NT gets more reliable because
> the way that UNIX systems are maintained in a lot of places makes
> them _unreliable_ and part of this goes back to installation and
> configuration of software applications. Yes.. we have CDE.. but
> how many systems are maintained with 'out-of-the-box' apps?
> UNIX folks need to take this issue seriously if they really want
> UNIX to survive. Otherwise.. they're gonna be back-end servers
> with out-of-box apps like ORACLE and the such.
I think your article is funny. Very funny, in that NT simply cannot be put
in a sentence with "reliable". Not with the "reboot solution" attached to it
as a means of problem resolution.
I also fail to see your premise for predicting the demise of UNIX
and the UNIX-like OSes, as the addition of security modules such as SSH and TCP
wrappers is simple and does not conflict with applications. To put it simply,
run a daemon, kill a daemon; add a line, comment out the line.
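To make that concrete, a sketch using the conventional TCP wrappers and inetd files (the network addresses are made up):

```shell
# "Add a line": TCP wrappers access control lives in two files.
# /etc/hosts.allow is consulted first:
#   sshd       : ALL               # ssh from anywhere
#   in.telnetd : 192.168.1.        # telnet only from the internal net
# /etc/hosts.deny then catches everything else:
#   ALL : ALL
#
# "Kill a daemon": disabling rlogind is one comment character in
# /etc/inetd.conf --
#   #login stream tcp nowait root /usr/sbin/in.rlogind in.rlogind
# -- followed by a HUP so inetd rereads its configuration:
kill -HUP "$(cat /var/run/inetd.pid)"
```

The pidfile path varies by system (some use `ps` to find inetd instead), but the whole change is two edits and a signal, nothing that entangles applications.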
If you require a "real" UNIX Admin to help you, there are whole communities
of us waiting to help. Need help in the microsoft world? You better have a
support contract!
Jide
Network Operations/engineering/security personnel
Prism Communications Inc
Comdisco Inc.
Transwire Communications Inc
Unto every Operating System, give its due.
>I think your article is funny. Very funny in that NT can simply not be put
>in a sentence with reliable. Not with the "reboot solution" attached to it
>as a means of problem resolution.
I'm not an advocate of NT.. I'm an advocate of stable and reliable
systems that provide a consistent resource to those that depend on it.
Both UNIX and NT often fail this test. IMHO, NT because it is still
evolving as an OS.. and UNIX because it can be configured a zillion
different ways.. and often is... on-the-fly by admins who utilize
their own judgement and unsupported shareware to 'improve' the system
- often at the expense of those that depend on it to get their job
done.
>I also fail to see your premise for the prediction of the demise of the UNIX
>and UNIX-like OSs' as addition of security modules such as SSH and TCP
>wrapper are simple and non conflicting with applications. To put it simply,
>run a daemon, kill a daemon; add a line, comment out the line.
daemons are nothing more than code that references data and
executables on the system. Fiddling with the daemons when you
are not sure of other possible interactions is just plain foolish.
UNIX as an OS is a legitimate and competent OS and has demonstrated
that it can handle client-server duties and remain scalable. The
weakness is that it can be configured so many different ways such that
applications written for it often don't work properly if they depend
on the out-of-box configuration because that configuration has been
altered. The most successful applications in UNIX are those that
maintain control of their own resources and don't depend on the
system configuration. This is especially true of X applications. CDE
offers great promise as a standard as long as admins are not tempted
to alter it in ways that would make it incompatible with COTS
software. Shareware often depends on customized configuration changes
in order to work.. or worse.. they don't take into account that
configuration settings can be different than how the app was
originally developed and tested.
>
>If you require a "real" UNIX Admin to help you, there are whole communities
>of us waiting to help. Need help in the microsoft world? You better have a
>support contract!
"real" UNIX admins focus on the importance of a stable and reliable
computing resource AND being able to maintain it without having it
go down and become unusable because of what are essentially
system configuration issues. Read this newsgroup.. and the problems
that people report and need help with.. they are almost all due to
system configuration issues that often result from site-customized
changes and/or shareware that often does not have a configuration spec
- the spec that tells you what systems that it has been specifically
coded and tested for... and then installed on systems where it becomes
known that there are compatibility issues.... then 'creative' ( highly
skilled I'm sure) folks attempt 'bug' fixes on their own. Even if
successful.. try to migrate to the next version of the OS without
problems or try to build a new one just like the one you have online
and you find.. quite often.. that so many ad-hoc changes have been
made to the previous system that it is almost impossible to re-create
it on a new system. This is not a legitimate way of maintaining
systems. The frequent outcome of this kind of administration is
systems that no longer provide important functions to the people
that use the machines. In the end.. every time something like this
happens.. it convinces those in charge that UNIX is a difficult
and unpredictable OS with little accountability when something goes
wrong and worse.. if your UNIX guru is not around.. you (the company)
are in deep doo doo. NT has similar problems... more often with
shareware but as an OS.. it actually LIMITs configuration changes...
a _weakness_ in trying to deploy solutions to users but a major
_strength_ when it keeps people from 'fiddling' with the system
when they shouldn't be.
To make it short.
If you are able to find (and hire) a skilled and competent system
administrator, you are much better off with a UNIX system.
If you can't find a competent system administrator and therefore are
stuck with a moron, stick with Micro$hit. Call it damage control ;-)
/bart
--
caffeine low .... brain halted
That's a problem with incompetent administration. A determined
person can break anything. Any sufficiently powerful tool will
let you shoot yourself in the foot.
>UNIX as an OS is a legitimate and competent OS and has demonstrated
>that it can handle client-server duties and remain scalable. The
>weakness is that it can be configured so many different ways such that
Well, that "weakness" is also its strength. Unix won't go away
until something similar and better comes along (probably not even then),
and no matter how stable it gets, NT just ain't it. It's not a matter
of relative stability, it's a matter of design philosophy.
I don't see Unix going away, I see it evolving. The clueless will
switch to NT, thinking it will allow them to point and click their
way to Nirvana. That's fine with me -- I have no plans to work for
a clueless employer and widespread use of NT for anything other than
workstations tips me off right away.
I think before NT matures enough even some of these people will
start to realize that the emperor has no clothes.
--
Craig Johnston
c...@lfn.org
> In article <37340cee....@news.erols.com>,
> larry gross <lgr...@pobox.com> wrote:
> >On Fri, 07 May 1999 00:54:11 -0400, Otuyelu Jide
> >
> >I'm not an advocate of NT.. I'm an advocate of stable and reliable
> >systems that provide a consistent resource to those that depend on it.
> >Both UNIX and NT often fail this test. IMHO, NT because it is still
> >evolving as an OS.. and UNIX because it can be configured a zillion
> >different ways.. and often is... on-the-fly by admins who utilize
> >their own judgement and unsupported shareware to 'improve' the system
> >- often at the expense of those that depend on it to get their job
> >done.
I think that you will find that a 'good' systems administrator will add,
remove, install and upgrade any bits of software that they
think will be 'beneficial' to their .... and here's the keyword, SYSTEM.
With a systemic point of view in mind, you could add just about any piece
of software and configure it in any way possible provided that, by doing
so, you do NOT detriment the system.
From personal experience, I have seen a (large) number of UNIX systems of
all flavours (SGI, Sun, HP etc.) where some doofus has thought
"I know. I'll put a webserver on here!" even though the system concerned
does not have a great amount of memory resource (physical or virtual).
They then take the "out of the box" configuration, max all the values they
can (to give 'better performance') and then wonder why the system
crawls along while they have 150+ child webserver processes hanging around,
unused for the last 30+ minutes.
A "good" systems administrator should be able to take ANY piece of viable
software and either A> get it to run on their target system or B> be able
to know and explain just exactly why this software should NOT be put on
that system.
> That's a problem with incompetent administration. A determined
> person can break anything. Any sufficiently powerful tool will
> let you shoot yourself in the foot.
....I could say that a 'good' system administrator can "break anything" if
they are determined to, as they will be able to see the major imperfections
in a system. I think you need to distinguish between "haplessly breaking"
and "willfully breaking" here!!! *grin*
> I don't see Unix going away, I see it evolving. The clueless will
> switch to NT, thinking it will allow them to point and click their
> way to Nirvana. That's fine with me -- I have no plans to work for
> a clueless employer and widespread use of NT for anything other than
> workstations tips me off right away.
I don't think that that is necessarily true. Companies like RedHat (no
insults intended here, OK??) have taken Linux and, through the addition of
a nice X11-based front-end and shed-loads of documentation, made it so
that 'administering' your own Unix system is "easy". The clueless will
see that and think "Unix with a point'n'click interface!!! Great!!!". The
problem here is not whether the OS attracts dummies, but how to prevent
dummies from getting "in too deep".
That was the good thing about the clunky old UNIX interface. If you wanted
to find out, you HAD to go through that. If you were just a casual user,
you'd probably get as far as nethack and then logout. If you were
committed, then you stayed with it.
> I think before NT matures enough even some of these people will
> start to realize that the emperor has no clothes.
I think that people generally 'attract' towards what they first experience.
I have always been a 'preferer' of UNIX, having started on UNIX systems
and while I LURVE playing Quake and stuff like that on a PC, I still would
rather Linux-up an NT-box and THEN stick it on the net than take a Linux'ed
PC and stick NT on it and go that way onto the net.
One thing I believe that I have observed, on several occasions, is what I
would describe as a 'vacantness' in those who have come into this game
solely through the Windoze/NT route. The number of times I have sat and
spoken with people who are 'NT heads' about things like web interfaces and
system daemons and all I have had is this blank look of disbelief come
back, followed by the look that they think that "YOU are a nut, you don't
know what you're talking about".
Windows and NT 'breeds', IMHO, a culture of 'smart monied ignorance'. You
get your system, you slam a stack of software on there, you hammer in a few
plug'n'play cards and.... when the system no longer works right, you go out
and buy a bigger and better system and carry on as before until the next
time. If you're "serious" about computing, you do the same and when the
system breaks, you get out your credit card and phone a help-line who will
charge you $5/minute for 30 minutes just to eventually tell you to either
A> back out the last X packages you installed, B> re-install your base
level operating system or C> go out and buy a bigger, better system.
I don't think that NT will go away. I just think that those of us (and I
include myself in the next group) who have "learned computing the hard way"
will happily carry on, picking up cheap, discounted hardware and software
and making beautiful systems that we can charge $5/minute for the
Windoze-kiddies to access!!!!
Take care and have fun,
Steff
>> >I'm not an advocate of NT.. I'm an advocate of stable and reliable
>> >systems that provide a consistent resource to those that depend on it.
>> >Both UNIX and NT often fail this test. IMHO, NT because it is still
>> >evolving as an OS.. and UNIX because it can be configured a zillion
>> >different ways.. and often is... on-the-fly by admins who utilize
>> >their own judgement and unsupported shareware to 'improve' the system
>> >- often at the expense of those that depend on it to get their job
>> >done.
>
>I think that you will find that a 'good' systems administrator will add,
>remove, install and upgrade any bits of software that they
>think will be 'beneficial' to their .... and here's the keyword, SYSTEM.
on target
>With a systemic point of view in mind, you could add just about any piece
>of software and configure it in any way possible provided that, by doing
>so, you do NOT detriment the system.
judgement of the admin is the issue. Who decides, what criteria, and how do
you ensure a solid install that stays reliable across OS upgrades?
>
>From personal experience, I have seen a (large) number of UNIX systems of
>all flavours (SGI, Sun, HP etc.) where some doofus has thought
>"I know. I'll put a webserver on here!" even though the system concerned
>does not have a great amount of memory resource (physical or virtual).
>They then take the "out of the box" configuration, max all the values they
>can (to give 'better performance') and then wonder why the system
>crawls along while they have 150+ child webserver processes hanging around,
>unused for over the last 30+ minutes.
often cited as an NT disease
>
>A "good" systems administrator should be able to take ANY piece of viable
>software and either A> get it to run on their target system or B> be able
>to know and explain just exactly why this software should NOT be put on
>that system.
exactly and it should be in the form of a recommendation to a team or
supervisor, not a personal decision.
>
>> That's a problem with incompetent administration. A determined
>> person can break anything. Any sufficiently powerful tool will
>> let you shoot yourself in the foot.
>
>....I could say that a 'good' system administrator can "break anything" if
>they are determined to, as they will be able to see the major imperfections
>
>in a system. I think you need to distinguish between "haplessly breaking"
>and "willfully breaking" here!!! *grin*
yup... we have a spare asset, made available by a new procurement.
Instead of surplusing it.. we now use it as a guinea pig for new
installs of software and patches that we don't feel we fully
understand or trust. We NEVER live-update a production machine during
prime shift. I was shocked when I first joined this organization
because that is exactly what they did routinely.. with all the
wonderful fallout that results from it. So we had a user base that was
cynical about machine reliability and did not trust the admins any
further than they could throw them. This is not a professional
approach.
>
>> I don't see Unix going away, I see it evolving. The clueless will
>> switch to NT, thinking it will allow them to point and click their
>> way to Nirvana. That's fine with me -- I have no plans to work for
>> a clueless employer and widespread use of NT for anything other than
>> workstations tips me off right away.
it's a reality .. brought about because UNIX apps for office
automation are dismal even though they had a 20 year head start on the
PCs. StarOffice is NOW touted as the solution. It's not bad..but give
me a break.
>
>I don't think that that is necessarily true. Companies like RedHat (no
>insults intended here, OK??) have taken Linux and, through the addition of
>a nice X11-based front-end and sheds-loads of documentation, made it so
>that 'administering' your own Unix system is "easy". The clueless will
>see that and think "Unix with a point'n'click interface!!! Great!!!". The
>problem here is not whether the OS attracts dummies, but how to
>prevent dummies from getting "in too deep".
there is no substitute for skill AND experience. Bright highly skilled
people are no guarantee that they won't use incredibly bad judgement.
>
>That was the good thing about the clunky old UNIX interface. If you wanted
>to find out, you HAD to go through that. If you were just a casual user,
>you'd probably get as far as nethack and then logout. If you were
>committed, then you stayed with it.
>
>
>> I think before NT matures enough even some of these people will
>> start to realize that the emperor has no clothes.
>
>I think that people generally 'attract' towards what they first experience.
>I have always been a 'preferer' of UNIX, having started on UNIX systems
>and while I LURVE playing Quake and stuff like that on a PC, I still would
>rather Linux-up an NT-box and THEN stick it on the net than take a Linux'ed
>PC and stick NT on it and go that way onto the net.
>
>One thing I believe that I have observed, on several occasions, is what I
>would describe as a 'vacantness' in those who have come into this game
>solely through the Windoze/NT route. The number of times I have sat and
>spoken with people who are 'NT heads' about things like web interfaces and
>system daemons and all I have had is this blank look of disbelief come
>back, followed by the look that they think that "YOU are a nut, you don't
>know what you're talking about".
they don't believe because the apps they want and use are not on UNIX
and most of what they hear from other users about the usability of
UNIX is bad. I have friends in the private sector.. some are in
charge. You should hear their comments about UNIX and UNIX dweebs that
are supposedly responsible for deploying reliable and useable
systems. Who wants some smart ass holding you and the rest of the
company hostage to their own personal inclinations? OUTSOURCE and NT
have become a reality because of this.
>
>Windows and NT 'breeds', IMHO, a culture of 'smart monied ignorance'. You
>get your system, you slam a stack of software on there, you hammer in a few
>plug'n'play cards and.... when the system no longer works right, you go out
>and buy a bigger and better system and carry on as before until the next
>time. If you're "serious" about computing, you do the same and when the
>system breaks, you get out your credit card and phone a help-line who will
>charge you $5/minute for 30 minutes just to eventually tell you to either
>A> back out the last X packages you installed, B> re-install your base
>level operating system or C> go out and buy a bigger, better system.
Companies need 'turn-key' solutions.. not someone who is holding
things together with strings and bubble gum.. then takes a vacation
during their busiest times. Companies want accountability and a
'warranty'. It's a very open question whether they can get it from NT. We
have an NT server at work that serves MS apps via a product known as
Wincenter on x-terms. It works GREAT .. for a time.. and then the blue
screen of death rears its ugly head. Support? Yes.. they charge... is
our problem fixed? No. So.. the anti-NTers are correct.. only the
problem extends to UNIX also.
>
>I don't think that NT will go away. I just think that those of us (and I
>include myself in the next group) who have "learned computing the hard way"
>will happily carry on, picking up cheap, discounted hardware and software
>and making beautiful systems that we can charge $5/minute for the
>Windoze-kiddies to access!!!!
Folks who actually know what they are doing AND have half a brain in
terms of understanding business requirements are in big demand. Others
need not apply.
...or more correctly, you can telnet to more than one port.
telnet and rlogin connect to two different ports. If you comment out
the lines in /etc/inetd.conf, you prevent the machine from listening
on those ports. If you look in /etc/services, you can see which ports
are assigned to which services. The lines for exec, login and shell
are for rlogin and rsh.
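As a sketch of both pieces, the sample files below stand in for the real
/etc/inetd.conf and /etc/services (the layouts and daemon paths shown are
typical but vary by system):

```shell
# Sample files; the real ones live at /etc/inetd.conf and /etc/services.
cat > inetd.sample <<'EOF'
telnet  stream  tcp  nowait  root  /usr/sbin/in.telnetd  in.telnetd
shell   stream  tcp  nowait  root  /usr/sbin/in.rshd     in.rshd
login   stream  tcp  nowait  root  /usr/sbin/in.rlogind  in.rlogind
exec    stream  tcp  nowait  root  /usr/sbin/in.rexecd   in.rexecd
EOF
cat > services.sample <<'EOF'
telnet          23/tcp
smtp            25/tcp          mail
exec            512/tcp
login           513/tcp
shell           514/tcp         cmd
EOF

# Which ports belong to telnet and the r-commands? telnetd listens on
# 23; the r-commands use exec/login/shell (512-514).
grep -wE 'telnet|exec|login|shell' services.sample

# Comment out the r-command entries. On a live system you would edit
# /etc/inetd.conf itself and then `kill -HUP` the inetd process so it
# rereads the file and stops listening on those ports.
sed -e 's/^shell/#shell/' -e 's/^login/#login/' -e 's/^exec/#exec/' \
    inetd.sample > inetd.new
cat inetd.new
```

After the HUP, telnet to the machine still works but rlogin/rsh
connections are refused, since nothing listens on 512-514 any more.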
A good example is to telnet to a machine with sendmail running on it.
Since sendmail uses the SMTP protocol, it usually listens on port 25:
telnet host.domain.com 25
On the other hand, rlogin will not allow you to select a port.
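On Linux systems the same name-to-port mapping can also be queried with
getent, which reads /etc/services for you (a sketch; getent availability
and the exact entries vary by system):

```shell
# getent consults /etc/services, so there's no need to grep it by hand.
getent services smtp     # the port sendmail usually listens on
getent services login    # the port rlogin always connects to
```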
Noal
--
"Wise men talk because they have something to say;
fools, because they have to say something"
-Plato
--== Sent via Deja.com http://www.deja.com/ ==--
---Share what you know. Learn what you don't.---