
Router/thermostat interference?


Tim+

Jul 26, 2012, 10:19:19 AM
My mother recently bought an iPad. To let her go online in the front room
of her house, I moved her router into the hall.

Since the move, the wifi connection to any device seems to disappear every 5
minutes or so for about a minute. This didn't happen before I moved the
router.

I noticed today that she has a wireless thermostat also installed nearby in
the hall (only about 4' away). It's labelled "British Gas RC Plus".

Is it possible that the thermostat is causing the problem? The loss of
signal even happens when the CH/HW is off, but I guess the thermostat might
still be "chirping" to the boiler regularly.

I should add that she lives in a tenement flat and there are quite a lot of
other wi-fi networks around. I have tried changing the channel once but
this made no obvious difference.

Tim

Paul D Smith

Jul 26, 2012, 10:39:10 AM
"Tim+" <timdow...@nospampleaseyahoo.co.uk> wrote in message
news:jurjla$hcf$1...@dont-email.me...
Can you whip the battery out of the thermostat and see if that helps?

Also, can you borrow a PC with WiFi and see what channels are in use around
you? Try to be 3 channels different with yours.
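For what it's worth, the reason channel separation matters: the 2.4 GHz channels are spaced only 5 MHz apart, but each transmission occupies roughly 20 MHz, so nearby channel numbers still overlap each other. A rough Python sketch of the overlap rule (the channel width is approximate):

```python
# 2.4 GHz Wi-Fi: centre frequency = 2412 MHz + 5 MHz per channel step
# (channel 14 is a special case, ignored here). Each signal is ~20 MHz wide.
CHANNEL_WIDTH_MHZ = 20

def centre_mhz(channel):
    """Centre frequency of a 2.4 GHz channel (1-13)."""
    return 2412 + 5 * (channel - 1)

def channels_overlap(a, b):
    """True if the ~20 MHz signals on channels a and b overlap in spectrum."""
    return abs(centre_mhz(a) - centre_mhz(b)) < CHANNEL_WIDTH_MHZ

# Channels 1, 6 and 11 are the classic mutually non-overlapping set:
for a, b in [(1, 4), (1, 6), (6, 11)]:
    print(a, b, channels_overlap(a, b))
```

Under those numbers, even three channels apart (e.g. 1 and 4) still overlaps; five apart (1 and 6) does not.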

Also, some routers have an "if you see interference, automatically change
channel" feature and this can give the same symptoms. I rely on a manual
check and lock down my channel otherwise SWMBO complains when US radio
(she's a yank) drops out during her favourite streamed radio show.

Paul DS.

The Natural Philosopher

Jul 26, 2012, 11:03:46 AM
+1 to all of this. RF stats use the same band as wifi.


--
To people who know nothing, anything is possible.
To people who know too much, it is a sad fact
that they know how little is really possible -
and how hard it is to achieve it.

Jim

Aug 10, 2012, 11:26:39 AM
When I read that statement, I checked my own (Honeywell) and the spec
says 868MHz. If the OP's is a Drayton (as I read in a search result),
it will operate on 433MHz. Neither should interfere with Wi-Fi at 2.4GHz.

After complaints from a user in my own Mum's house, I took my laptop to
the seat outside the front door and found 17 networks - this in a low
density area with a lot of elderly neighbours. If you let your signal
level drop, it's likely to be swamped by low-level noise from other
users, especially in a tenement where routers may be only a few metres
apart.

Graham J

Aug 10, 2012, 12:13:35 PM
[snip]
>
> After complaints from a user in my own Mum's house, I took my laptop to
> the seat outside the front door and found 17 networks - this in a low
> density area with a lot of elderly neighbours. If you let your signal
> level drop, it's likely to be swamped by low-level noise from other
> users, especially in a tenement where routers may be only a few metres
> apart.

Fundamentally wireless technology is doomed, simply because everybody
wants it. Domestic powerline networking might well go the same way.

The solution is to use cat5 cable.

Pity about all those people who bought hand-held devices lacking an RJ45
connector ...!

--
Graham J


The Natural Philosopher

Aug 10, 2012, 1:39:03 PM
The 3rd harmonic is close, at 3 x 868 MHz = 2.604 GHz, and harmonics of 433
MHz are close as well.
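A quick sanity check of the arithmetic - taking the 2.4 GHz ISM band as roughly 2400-2483.5 MHz (band edges are approximate), the nearest harmonics of both carriers land just above it:

```python
# Which harmonic of each thermostat carrier lands nearest the 2.4 GHz
# ISM band (approx. 2400-2483.5 MHz)? Frequencies are in MHz.
BAND = (2400.0, 2483.5)

def nearest_harmonic(fundamental_mhz):
    """Return (harmonic number, frequency) closest to the band centre."""
    centre = (BAND[0] + BAND[1]) / 2
    n = round(centre / fundamental_mhz)
    return n, n * fundamental_mhz

for f in (868.0, 433.92):
    n, freq = nearest_harmonic(f)
    in_band = BAND[0] <= freq <= BAND[1]
    print(f"{f} MHz: harmonic {n} at {freq:.2f} MHz, inside band: {in_band}")
```

So 3 x 868 and 6 x 433.92 both fall around 2.60 GHz - near the band, but just outside its upper edge.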



> After complaints from a user in my own Mum's house, I took my laptop to
> the seat outside the front door and found 17 networks - this in a low
> density area with a lot of elderly neighbours. If you let your signal
> level drop, it's likely to be swamped by low-level noise from other
> users, especially in a tenement where routers may be only a few metres
> apart.
>


--
Ineptocracy

(in-ep-toc’-ra-cy) – a system of government where the least capable to
lead are elected by the least capable of producing, and where the
members of society least likely to sustain themselves or succeed, are
rewarded with goods and services paid for by the confiscated wealth of a
diminishing number of producers.

The Natural Philosopher

Aug 10, 2012, 1:41:47 PM
I can't even get wifi from this office here to the kitchen, about 6m line
of sight - if there were a line of sight.. too much foil-backed
plasterboard in the way.

Nephew has been and gone and parked his car near the oil tank. Bang went
the fuel gauge signal as well.

Ian Jackson

Aug 10, 2012, 2:24:29 PM
In message <k03h4r$fee$6...@news.albasani.net>, The Natural Philosopher
<t...@invalid.invalid> writes
>Graham J wrote:
>> [snip]
>>>
>>> After complaints from a user in my own Mum's house, I took my laptop to
>>> the seat outside the front door and found 17 networks - this in a low
>>> density area with a lot of elderly neighbours. If you let your signal
>>> level drop, it's likely to be swamped by low-level noise from other
>>> users, especially in a tenement where routers may be only a few metres
>>> apart.
>> Fundamentally wireless technology is doomed, simply because
>>everybody wants it. Domestic powerline networking might well go the
>>same way.
>> The solution is to use cat5 cable.
>> Pity about all those people who bought hand-held devices lacking an
>>RJ45 connector ...!
>>
>I cant even get wifi from this office here to the kitchen about 6m line
>of sight - if there was a line of sight,..too much foil backed
>plasterboard in the way..
>
>Ne[hew has been and gibe and parked his car near the oil tank. Bang
>went the fuel gauge signal as well,.
>
There are 13 x 2.4GHz wifi channels.
http://en.wikipedia.org/wiki/List_of_WLAN_channels
If your laptop has the facility, do what is (on my wife's laptop) a
'site survey'. This does a spectrum analysis of the wifi signals it is
receiving, and it identifies them. See which channels are not being used
(or where the signal is weak), and if possible, change your wifi link to
one of those.
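If anyone wants to automate that, the same idea as a rough Python sketch - given scanned (channel, power) pairs (the survey numbers here are made up), score each candidate channel by the total overlapping power and pick the quietest:

```python
# Pick the quietest 2.4 GHz channel from site-survey results.
# Scan data is (channel, received power in mW) -- values are hypothetical.

def overlaps(a, b):
    """Channels within 4 steps (20 MHz) of each other overlap in spectrum."""
    return abs(a - b) <= 4

def quietest_channel(scan, candidates=range(1, 14)):
    """Return the candidate channel seeing the least overlapping power."""
    def load(ch):
        return sum(power for c, power in scan if overlaps(c, ch))
    return min(candidates, key=load)

# Hypothetical survey: busy networks on channels 1 and 6, a weak one on 11.
scan = [(1, 5.0), (1, 3.0), (6, 4.0), (6, 2.0), (11, 0.5)]
print(quietest_channel(scan))  # -> 11
```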
--
Ian

Woody

Aug 10, 2012, 3:18:20 PM
"Ian Jackson" <ianREMOVET...@g3ohx.demon.co.uk> wrote in
message news:qTyiQmFd...@g3ohx.demon.co.uk...
Better still if you have an Android smartphone you can download a
free wi-fi viewing app which not only shows the signal strength
and channel(s) being used but also shows the SSID. Immensely
useful.


--
Woody

harrogate three at ntlworld dot com


tony sayer

Aug 10, 2012, 4:53:43 PM
In article <k03gvn$fee$3...@news.albasani.net>, The Natural Philosopher
<t...@invalid.invalid> scribeth thus
I very much doubt that there will be any energy in a 3rd harmonic, or any
other come to that, from those piddly power levels!..

>
>
>
>> After complaints from a user in my own Mum's house, I took my laptop to
>> the seat outside the front door and found 17 networks - this in a low
>> density area with a lot of elderly neighbours. If you let your signal
>> level drop, it's likely to be swamped by low-level noise from other
>> users, especially in a tenement where routers may be only a few metres
>> apart.
>>
>
>
Try a crowded street in Cambridge. Most everyone has got one, sometimes
more even!....
--
Tony Sayer

The Natural Philosopher

Aug 10, 2012, 5:12:47 PM
You might be surprised..they aren't exactly plastered in filters.

Be worth sticking an analyser on if you ever get time..

>>
>>
>>> After complaints from a user in my own Mum's house, I took my laptop to
>>> the seat outside the front door and found 17 networks - this in a low
>>> density area with a lot of elderly neighbours. If you let your signal
>>> level drop, it's likely to be swamped by low-level noise from other
>>> users, especially in a tenement where routers may be only a few metres
>>> apart.
>>>
>>
> Try a crowded street in Cambridge. Most everyone has got one sometimes
> more even;!....


--

PeeGee

Aug 11, 2012, 3:40:12 AM
Not always that easy. A few months ago, every time I "moved" my signal,
another unit (presumably set to auto) switched too - there was no
unauthorised access through my unit. Also, both my neighbour and I have
11n units which use two channels (300Mbps capable, using channels 1+5 and
11+7 - mine doesn't have 12 or 13 available), so between us we cover the
spectrum :-(

--
PeeGee

"Nothing should be able to load itself onto a computer without the
knowledge or consent of the computer user. Software should also be able
to be removed from a computer easily."
Peter Cullen, Microsoft Chief Privacy Strategist (Computing 18 Aug 05)


Roderick Stewart

Aug 11, 2012, 6:17:08 AM
In article <qTyiQmFd...@g3ohx.demon.co.uk>, Ian Jackson wrote:
> There are 13 x 2.4GHz wifi channels.
> http://en.wikipedia.org/wiki/List_of_WLAN_channels

That's the real problem. It's like IPv4 or the "millennium bug" - in
other words, a lack of foresight. It's something that worked in the
technical sense at the time it was invented, but for which nobody
considered the implications of real people wanting to use it.

> If your laptop has the facility, do what is (on my wife's laptop) a
> 'site survey'. This does a spectrum analysis of the wifi signals it is
> receiving, and it identifies them. See which channels are not being used
> (or where the signal is weak), and if possible, change your wifi link to
> one of those.

For Windows or Linux there is a utility called "InSSIDer", and for Android
there's one called "Wifi Analyser".

Rod.
--
Virtual Access V6.3 free usenet/email software from
http://sourceforge.net/projects/virtual-access/

Roderick Stewart

Aug 11, 2012, 6:17:08 AM
In article <LYCxofGX...@bancom.co.uk>, Tony sayer wrote:
> >> After complaints from a user in my own Mum's house, I took my laptop to
> >> the seat outside the front door and found 17 networks - this in a low
> >> density area with a lot of elderly neighbours. If you let your signal
> >> level drop, it's likely to be swamped by low-level noise from other
> >> users, especially in a tenement where routers may be only a few metres
> >> apart.
> >>
> >
> >
> Try a crowded street in Cambridge. Most everyone has got one sometimes
> more even;!....

Try a crowded street anywhere. I've found it quite interesting to use a
smartphone to look at wireless signals while walking to the local shops or on
a bus ride into town. I can usually pick up at least three other signals
apart from my own inside my home, half a dozen or more outside or in any
other residential area, but as soon as I get to any shopping area there are
usually too many to count. As there aren't nearly enough channels to avoid
clashes, I can only assume something like the FM "capture effect" must
prevent co-channel interference from unwanted signals when they are only a
little bit weaker than the wanted ones, and that the situation is a little
less chaotic inside the buildings and close to the routers.

The names people choose can sometimes be quite amusing. Names of nearby shops
or restaurants are quite easy to pick out of course, but I'll sometimes see
things like "darrensnetwork" or "darrenandtracy" or "eatmyshorts". There's
one near me called "Go away", and another called "fuck off use your own".
Others are simply called by manufacturers' names, or "default", or "network"
or even "Desktop PC". About one in ten, mostly amongst the ones with the less
creative names, is not secured.

Stephen

Aug 11, 2012, 9:56:14 AM
On Sat, 11 Aug 2012 11:17:08 +0100, Roderick Stewart
<rj...@escapetime.removethisbit.myzen.co.uk> wrote:

>In article <qTyiQmFd...@g3ohx.demon.co.uk>, Ian Jackson wrote:
>> There are 13 x 2.4GHz wifi channels.
>> http://en.wikipedia.org/wiki/List_of_WLAN_channels
>
>That's the real problem. It's like IPV4 or the "millennium bug", or in
>other words a lack of foresight. It's something that worked in the
>technical sense at the time it was invented, but for which nobody
>considered the implications of real people wanting to use it.

I suspect the channel numbers and band existed before WiFi - this is
the "Industrial, Scientific & Medical" or ISM unlicensed band, and is
used for other stuff as well as WiFi.

The big constraint for WiFi was finding a band which is more or less
useable + legal + not swamped with regulations and licence constraints
in nearly all countries.

If the 2.4 GHz ISM is swamped, then the easy fix for more channels is
to use the 5 GHz band - 8 channels, and no overlap. (easy meaning, buy
the kit and plug it in, then make it work.......)

Some recent kit has dual band hardware as standard, such as my work
Dell laptop (not that I would recommend the Dell, although my issues
with it are not WiFi related).

5 GHz is still much quieter, but is more absorbed by walls etc, so has
lower range.
>
>> If your laptop has the facility, do what is (on my wife's laptop) a
>> 'site survey'. This does a spectrum analysis of the wifi signals it is
>> receiving, and it identifies them. See which channels are not being used
>> (or where the signal is weak), and if possible, change your wifi link to
>> one of those.
>
>For Windows or Linux there is a utility called "InSSIDer", and for Android
>there's one called "Wifi Analyser".
>
>Rod.
--
Regards

stephe...@xyzworld.com - replace xyz with ntl

Roderick Stewart

Aug 11, 2012, 10:30:13 AM
In article <6ooc28hj3p9eo76ur...@4ax.com>, Stephen wrote:
> The big constraint for WiFi was finding a band which is more or less
> useable + legal + not swamped with regulations and licence constraints
> in nearly all countries
>
> If the 2.4 GHz ISM is swamped, then the easy fix for more channels is
> to use the 5 GHz band - 8 channels, and no overlap. (easy meaning, buy
> the kit and plug it in, then make it work.......)

Every little helps, but what's really needed to cope with current usage
is about a hundred channels that are not used for anything else, and
there is always the possibility that some as yet unforeseeable future
invention could require ten times the bandwidth we regard as normal now.
It's happened before.

The Natural Philosopher

Aug 11, 2012, 11:08:20 AM
Roderick Stewart wrote:
> In article <6ooc28hj3p9eo76ur...@4ax.com>, Stephen wrote:
>> The big constraint for WiFi was finding a band which is more or less
>> useable + legal + not swamped with regulations and licence constraints
>> in nearly all countries
>>
>> If the 2.4 GHz ISM is swamped, then the easy fix for more channels is
>> to use the 5 GHz band - 8 channels, and no overlap. (easy meaning, buy
>> the kit and plug it in, then make it work.......)
>
> Every little helps, but what's really needed to cope with current usage
> is about a hundred channels that are not used for anything else,

Well, that's about 10GHz of bandwidth.. where can we find that? Golly,
I know, we could use electrical signals down CAT5 or optical signals
down a fibre!

Problem solved!
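(To put rough numbers on the "hundred channels" idea - how much spectrum that actually takes depends entirely on the channel width you assume. A quick sketch, with illustrative widths:)

```python
# Spectrum needed for 100 non-overlapping channels at a few channel widths.
def spectrum_ghz(n_channels, width_mhz):
    """Total bandwidth in GHz for n non-overlapping channels of width_mhz."""
    return n_channels * width_mhz / 1000.0

for width in (20, 40, 80):  # classic 802.11 / 11n bonded / 11ac widths
    print(f"{width} MHz channels: {spectrum_ghz(100, width):.0f} GHz for 100 channels")
```

With 20 MHz channels that's about 2 GHz of clean spectrum; only with the wide bonded channels of later standards does the figure approach 8-10 GHz.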


and
> there is always the possibility that some as yet unforeseeable future
> invention could require ten times the bandwidh we regard as normal now.
> It's happened before.
>

You do have a penchant for stating the bleeding obvious as if it were a
Radical Thought, dontcha?
> Rod.

Graham J

Aug 11, 2012, 4:33:06 PM
Stephen wrote:
[snip]
> Some recent kit has dual band hardware as standard, such as my work
> Dell laptop (not that I would recommend the Dell, although my issues
> with it are not WiFi related).
>
Just out of interest, what model of Dell laptop, and what problems?

I've had problems with a specific model and would be interested to see
if yours is the same ...

--
Graham J


Roderick Stewart

Aug 12, 2012, 3:38:44 AM
In article <k05sh4$mql$2...@news.albasani.net>, The Natural Philosopher wrote:
> > there is always the possibility that some as yet unforeseeable future
> > invention could require ten times the bandwidh we regard as normal now.
> > It's happened before.
> >
>
> You do have a penchant for stating the bleeding obvious as if it were a
> Radical Thought, dontcha?

It's a pity a few more things aren't bleeding obvious when people are
inventing stuff, so that they can invent it properly instead of bodging it
or settling for inferior performance afterwards.

The Natural Philosopher

Aug 12, 2012, 4:41:31 AM
Roderick Stewart wrote:
> In article <k05sh4$mql$2...@news.albasani.net>, The Natural Philosopher wrote:
>>> there is always the possibility that some as yet unforeseeable future
>>> invention could require ten times the bandwidh we regard as normal now.
>>> It's happened before.
>>>
>> You do have a penchant for stating the bleeding obvious as if it were a
>> Radical Thought, dontcha?
>
> It's a pity a few more things aren't bleeding obvious when people are
> inventing stuff, so that they can invent it properly instead of bodging it
> or settling for inferior performance afterwards.
>
> Rod.

Yep that's why smiths used to make pistons instead of horseshoes.

They anticipated the arrival of the car by 100 years.

I assume you are a clueless student with no experience of the real world?

Roderick Stewart

Aug 12, 2012, 8:14:48 AM
In article <k07q7r$94m$1...@news.albasani.net>, The Natural Philosopher wrote:
> >> You do have a penchant for stating the bleeding obvious as if it were a
> >> Radical Thought, dontcha?
> >
> > It's a pity a few more things aren't bleeding obvious when people are
> > inventing stuff, so that they can invent it properly instead of bodging it
> > or settling for inferior performance afterwards.
> >
> > Rod.
>
> Yep that's why smiths used to make pistons instead of horseshoes.
>
> They anticipated the arrival of the car by 100 years.
>
> I assume you are a clueless student with no experience of the real world?

Nope. Considering the so-called "millennium bug", which was one of the examples
I gave, I'm not sufficiently clueless to be unaware that four decimal digits
are necessary for the year number not to be ambiguous, and nobody else with
responsibilities should be that clueless either. I wonder how much experience
of the real world was exercised by those who elected to use only two, or
perhaps if they thought that for some reason we would stop making computers
before the turn of the century? It has always struck me as a surprising lack of
foresight.

It's not possible to predict our future need for everything of course, but some
should be easy to estimate. The number of wireless channels needed if every
house has one and they have a certain range ought to be possible to calculate
in the same way as for broadcasting transmitters or mobile phone networks, but
we don't seem to have done this at all.

The Natural Philosopher

Aug 12, 2012, 8:21:51 AM
Roderick Stewart wrote:
> In article <k07q7r$94m$1...@news.albasani.net>, The Natural Philosopher wrote:
>>>> You do have a penchant for stating the bleeding obvious as if it were a
>>>> Radical Thought, dontcha?
>>> It's a pity a few more things aren't bleeding obvious when people are
>>> inventing stuff, so that they can invent it properly instead of bodging it
>>> or settling for inferior performance afterwards.
>>>
>>> Rod.
>> Yep that's why smiths used to make pistons instead of horseshoes.
>>
>> They anticipated the arrival of the car by 100 years.
>>
>> I assume you are a clueless student with no experience of the real world?
>
> Nope. Considering the so-called "millenium bug", which was one of the examples
> I gave, I'm not sufficiently clueless to be unaware that four decimal digits
> are necessary for the year number not to be ambiguous, and nobody else with
> responsibilities should be that clueless either. I wonder how much experience
> of the real world was exercised by those who elected to use only two, or
> perhaps if they thought that for some reason we would stop making computers
> before the turn of the century? It has always struck me as a surprising lack of
> foresight.
>

There was no millennium bug.

It was a simple FUD scheme to get a lot of money for the IT industry.

I know. I was there.


> It's not possible to predict our future need for everything of course, but some
> should be easy to estimate. The number of wireless channels needed if every
> house has one and they have a certain range ought to be possible to calculate
> in the same way as for broadcasting transmitters or mobile phone networks, but
> we don't seem to have done this at all.
>

Like the old 'upgradeable' US Robotics modems of the past..that no one
ever upgraded because the new standard was different anyway, and a new
modem cost less than the upgrade.

If you had been around engineering as long as I have you would know that
whatever it was you spent time and money expecting was never the thing
that actually happened.

You should read 'The Black Swan', where Taleb makes the point that all
the key events in the world are not linear extrapolations of things we
know about, but wild-card one-off discoveries or events that no one
thought would happen.

Try running a business geared towards accurate cost benefit analysis
instead of idealistic dreaming and you will see that in fact the
engineering is done exactly as it should be.



> Rod.

George Weston

Aug 12, 2012, 8:22:20 AM
I see your point.
I'll go off-topic a bit more and direct your attention to Dyson vacuum
cleaners as an example.
Wonderful design - and all those patents, but...
I've gone through two of them and have resolved never to buy another one.
They both suffered from the "overheating" control malfunctioning and
cutting out completely - forever.
I then went for a Sebo - good German engineering and no fancy design
innovations.
It just works.

George

The Natural Philosopher

Aug 12, 2012, 8:34:05 AM
So do I, but it's meaningless.

It's easy enough to calculate the total bandwidth needed for wifi. Any
Fule Kno that.

But that's cat-belling par excellence. The issue is finding a radio
spectrum big enough to put that bandwidth in - and, more, a spectrum
that isn't so short-range that it's really line of sight (optical
frequencies).

Which is why I directed the OP to examine the theory of data
communications.

He appears to think that the issue is a limitation in the technology. It
is not. It's a limitation in the underlying physics.

Cf Nyquist, Shannon, Turing et al.

Frankly I am appalled by the educational standards of today - even
among people who have allegedly technical qualifications - that the sense
of 'if you just spend enough money anything is possible' is all-pervasive.

Much of 20th century maths, engineering and physics has been spent on
establishing what you can't ever do, no matter what.

Like running Britain off 'renewables' or having gigabit speeds from WiFi
in any band sub the optical or a fixed cable.


> I'll go off-topic a bit more and direct your attention to Dyson vacuum
> cleaners as an example.
> Wonderful design - and all those patents, but...
> I've gone through two of them and have resolved never to buy another one.
> They both suffered from the "overheating" control malfunctioning and
> cutting out completely - forever.
> I then went for a Sebo - good German engineering and no fancy design
> innovations.
> It just works.
>
> George


AnthonyL

Aug 13, 2012, 7:26:57 AM
On Sun, 12 Aug 2012 13:21:51 +0100, The Natural Philosopher
<t...@invalid.invalid> wrote:

>Roderick Stewart wrote:
>>
>> Nope. Considering the so-called "millenium bug", which was one of the examples
>> I gave, I'm not sufficiently clueless to be unaware that four decimal digits
>> are necessary for the year number not to be ambiguous, and nobody else with
>> responsibilities should be that clueless either. I wonder how much experience
>> of the real world was exercised by those who elected to use only two, or
>> perhaps if they thought that for some reason we would stop making computers
>> before the turn of the century? It has always struck me as a surprising lack of
>> foresight.
>>
>
>There was no millenium bug.
>
>It was a simple FUD scheme to get a lot of money for the IT industry.
>
>I know. I was there.
>

No you weren't.

Accounting and forward order processing software that we wrote when
trying to squeeze as much as possible onto floppy disks and 10MB
drives had a lot of 2-digit years. At the time it was written, no-one
thought that users would still be running the software at the
millennium.

It took a lot of effort to fix it. The sort of effort that prevented
a millennium melt-down.


--
AnthonyL

The Natural Philosopher

Aug 13, 2012, 7:41:47 AM
So some crappy BASIC code that was still in use 30 years after the end
of its life, running off a floppy disk, was going to cause a global
meltdown, was it?

Heck even THEN I had the intelligence to say 'IF YEAR <1970 THEN
YEAR=YEAR+100;

But Unix systems and mainframes by and large had time functions that
were fully Y2K compliant, and that's what all big systems ran on.
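That one-liner is the classic "windowing" fix: pick a pivot and interpret two-digit years on either side of it. The same idea as a Python sketch (the pivot value of 70 is illustrative; real systems chose their own):

```python
def expand_year(two_digit, pivot=70):
    """Window a 2-digit year: values >= pivot become 19xx, the rest 20xx.
    This mirrors the IF YEAR < 1970 THEN YEAR = YEAR + 100 fix quoted above."""
    if not 0 <= two_digit <= 99:
        raise ValueError("expected a two-digit year")
    return 1900 + two_digit if two_digit >= pivot else 2000 + two_digit

print(expand_year(85), expand_year(12))  # -> 1985 2012
```

The catch, of course, is that the window only buys you a century: dates before 1970 and after 2069 can't both be represented.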

Martin Brown

Aug 13, 2012, 8:29:14 AM
On 13/08/2012 12:41, The Natural Philosopher wrote:
> AnthonyL wrote:
>> On Sun, 12 Aug 2012 13:21:51 +0100, The Natural Philosopher
>> <t...@invalid.invalid> wrote:
>>
>>> Roderick Stewart wrote:
>>>> Nope. Considering the so-called "millenium bug", which was one of
>>>> the examples I gave, I'm not sufficiently clueless to be unaware
>>>> that four decimal digits are necessary for the year number not to be
>>>> ambiguous, and nobody else with responsibilities should be that
>>>> clueless either. I wonder how much experience of the real world was
>>>> exercised by those who elected to use only two, or perhaps if they
>>>> thought that for some reason we would stop making computers before
>>>> the turn of the century? It has always struck me as a surprising
>>>> lack of foresight.
>>>>
>>> There was no millenium bug.
>>>
>>> It was a simple FUD scheme to get a lot of money for the IT industry.
>>>
>>> I know. I was there.

There were enough things that would fail badly on Y2k rollover that it
was necessary to do remedial work on a fair number of older systems.

I was also there and working on systems that did matter. Most of the
problems on PCs and workstations were largely cosmetic and amusing.
Realtime industrial process controllers and banking applications were
another matter altogether.
>>
>> No you weren't.
>>
>> Accounting and forward order processing software that we wrote when
>> trying to squeeze as much as possible onto floppy disks and 10mB
>> drives had a lot of 2 digit years. At the time it was written no-one
>> thought that users would still be running the software at the
>> millenium.
>>
>> It took a lot of effort to fix it. The sort of effort that prevented
>> a millenium melt-down.
>>
>>
> so some crappy BASIC code that was still in use 30 years after the end
> of its life m running off a floppy disk was going to cause a global
> meltdown was it?

Quite likely irrespective of the programming language used some of it
was in industrial controllers and logic for computing bank payments as
two examples where it matters. The core banking applications are
insanely old and inclined to have maintenance traps. You will recall
there have been some pretty spectacular SNAFUs in that sector recently.
>
> Heck even THEN I had the intelligence to say 'IF YEAR <1970 THEN
> YEAR=YEAR+100;

And if you only have one byte available to store the year in what then?
>
> But Unix systems and mainframes by and large had time functions that
> were fully Y2K compliant and thats what all big systems ran on

Except for the ones that used a byte for year and prefixed it with the
fixed string "19" in front of it leading to Y2k date of 1 Jan 19100 in
plenty of mainframe based sites and some websites which ISTR included
the USNO! BCD and string thinking prevailed in some places.

There is probably some production code still lurking even now that will
die horribly when the calendar reaches 2028 (128 years since 1900) or
2098 (128 years since 1970).

And almost certainly a whole host of really serious things are still
waiting to bite us in the arse on Unix 2^32 seconds rollover day 2038.

http://en.wikipedia.org/wiki/Year_2038_problem
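The 2038 limit is easy to verify: a signed 32-bit time_t counts seconds from the 1970 epoch and tops out at 2^31 - 1. A quick check in Python:

```python
from datetime import datetime, timezone, timedelta

# A signed 32-bit time_t holds at most 2**31 - 1 seconds past 1970-01-01 UTC.
MAX_INT32 = 2**31 - 1
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
rollover = epoch + timedelta(seconds=MAX_INT32)
print(rollover)  # -> 2038-01-19 03:14:07+00:00
```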

There was a real risk for memory constrained control systems coded by
brain dead cheap programmers which was exemplified by the shutdown of a
whole bunch of aluminium smelters around the world due to stupid leap
year related logic errors just four years prior to Y2k.

http://catless.ncl.ac.uk/Risks/18.74.html#subj5

And the corpse marking of certain bank cards used in cash machines on 1st
March or 31st December in year 2000 itself. ISTR one major UK bank
failed to notice that 2000, being divisible by 400, *IS* a leap year, and
the fudge put in to get around it on the day gave a mismatched number of
days in the year. It is easy to say that it was all a fuss about nothing
because no disasters occurred, but the reason that disaster was averted was
because the bulk of the serious faults were fixed in good time.
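For reference, the Gregorian rule in question - century years are leap years only when divisible by 400, so 2000 was a leap year although 1900 was not:

```python
def is_leap(year):
    """Gregorian rule: divisible by 4, except century years, which are
    leap years only when also divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2000), is_leap(1900), is_leap(1996))  # -> True False True
```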

Most of the survivalist nutters were disappointed that civil society
continued and are still eating tinned beans and sausages even now!
Those expecting "The Rapture" were even more disappointed.

Regards,
Martin Brown

David Hume

Aug 13, 2012, 10:05:08 AM
On 13/08/12 12:41, The Natural Philosopher wrote:
> AnthonyL wrote:
>>> I know. I was there.
>> No you weren't.
> so some crappy BASIC code that was still in use 30 years after the end
> of its life m running off a floppy disk was going to cause a global
> meltdown was it?
>
> Heck even THEN I had the intelligence to say 'IF YEAR <1970 THEN
> YEAR=YEAR+100;
>
> But Unix systems and mainframes by and large had time functions that
> were fully Y2K compliant and thats what all big systems ran on
>
>

I have to say you are really quite wrong. Banking systems written in
COBOL (for example) really needed to be fixed. It was no scam.

The Natural Philosopher

Aug 13, 2012, 10:58:13 AM
It was when they extended it way beyond COBOL and BASIC, to code that
didn't even know or care what year it was. Sheesh.

AND that was the code maintainers' responsibility to fix - a few banks
offline for a couple of days - nothing new there, then.

Andy Burns

Aug 13, 2012, 3:34:55 PM
The Natural Philosopher wrote:

> Heck even THEN I had the intelligence to say 'IF YEAR <1970 THEN
> YEAR=YEAR+100;
>
> But Unix systems and mainframes by and large had time functions that
> were fully Y2K compliant and thats what all big systems ran on

How will they fare in another 26 years?

Roderick Stewart

Aug 14, 2012, 5:50:34 AM
In article <5028e2fa...@news.zen.co.uk>, AnthonyL wrote:
> Accounting and forward order processing software that we wrote when
> trying to squeeze as much as possible onto floppy disks and 10mB
> drives had a lot of 2 digit years. At the time it was written no-one
> thought that users would still be running the software at the
> millenium.

Of course the users wouldn't be using the same software, but they'd
need to be working with the same data wouldn't they? Regardless of the
software, it seems the data was recorded with insufficient resolution
to cope with a predictable event that would occur within less than a
lifetime, which seems to me a bit of a lack of foresight.

Roderick Stewart

Aug 14, 2012, 5:50:34 AM
In article <k0ap5r$4dr$1...@news.albasani.net>, The Natural Philosopher
wrote:
> Heck even THEN I had the intelligence to say 'IF YEAR <1970 THEN
> YEAR=YEAR+100;

Good grief! I haven't been born yet!

Martin Brown

Aug 14, 2012, 6:35:28 AM
On 14/08/2012 10:50, Roderick Stewart wrote:
> In article <5028e2fa...@news.zen.co.uk>, AnthonyL wrote:
>> Accounting and forward order processing software that we wrote when
>> trying to squeeze as much as possible onto floppy disks and 10mB
>> drives had a lot of 2 digit years. At the time it was written no-one
>> thought that users would still be running the software at the
>> millenium.
>
> Of course the users wouldn't be using the same software, but they'd
> need to be working with the same data wouldn't they? Regardless of the
> software, it seems the data was recorded with insufficient resolution
> to cope with a predictable event that would occur within less than a
> lifetime, which seems to me a bit of a lack of foresight.

It depends how the two digits were done. Too many used BCD in byte
representation which means you are stuffed after 99 rolling to 00.

Back then y2000 was a very long way away and storage was expensive so
adding a byte to every date seemed like overkill. Designs that did it
sensibly (ie not using BCD arithmetic) are good to 2028 if they used
year zero as 1900 and signed bytes or to 2156 if they used unsigned
bytes. Many used a year zero of 1950 or 1970 and will work longer.

It was mainly the systems using BCD nibbles that went haywire at Y2k.
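
To make the two storage schemes concrete, here is a small Python sketch (an
illustration, not code from the thread) of what Martin describes: two BCD
digits wrap at 99 and a naive reader lands back in 1900, while a one-byte
binary offset from a 1900 epoch keeps working well past 2000.

```python
def bcd_store(year):
    """Store a year as two BCD digits (tens, units) -- the scheme that wraps."""
    yy = year % 100
    return yy // 10, yy % 10

def bcd_read(tens, units):
    """A naive reader assumes the 1900s, so 2000 reads back as 1900."""
    return 1900 + 10 * tens + units

def byte_offset(year, signed=True):
    """Store a year as a one-byte binary offset from a 1900 epoch."""
    offset = year - 1900
    limit = 127 if signed else 255
    if not 0 <= offset <= limit:
        raise OverflowError("year will not fit in one byte")
    return offset

print(bcd_read(*bcd_store(2000)))              # -> 1900 (the Y2K wrap)
print(1900 + byte_offset(2012))                # -> 2012 (signed byte, good until 2028)
print(1900 + byte_offset(2155, signed=False))  # -> 2155 (unsigned byte, good until 2156)
```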

Regards,
Martin Brown

Peter Boulding

Aug 14, 2012, 6:50:54 AM
to
On Mon, 13 Aug 2012 15:58:13 +0100, The Natural Philosopher
<t...@invalid.invalid> wrote in <k0b4m5$rke$3...@news.albasani.net>:

>> I have to say you are really quite wrong. Banking systems written in
>> COBOL (for example) really needed to be fixed. It was no scam.
>>
>It was when they extended it way beyond COBOL and BASIC to code that
>didn't even know or care what year it was. sheesh.
>
>AND that was the code maintainers reponsibility to fix - a few banks
>offline for a couple off days- nothing new there, then.

I know it's fashionable to believe that there was no millennium bug (or that
the required code changes were tiny in comparison with the resources
expended) but that really ain't the case.

Banks, insurance companies, and other financial institutions, in particular,
were early adopters and built up complex interdependent systems much of whose
core code was written in COBOL--or even earlier languages--that relied on
DDMMYY-style data declarations; it was easy enough to change such
declarations and associated code to use DDMMYYYY--in principle; much was
very poorly-written code, which is hard to update (and often undocumented,
which makes it a lot harder). But every interdependent program had to be
updated to match and couldn't be adequately tested until and unless it was
possible to test every scenario that made use of such dependencies; as a
result planning, administering and testing the updates proved to be a
logistical nightmare on a scale that many companies had never previously had
to face.

It's worth noting that certain recent high-profile IT fuck-ups in the
banking world result from upgrades that failed to take account of
peculiarities in core code so old that the relevant companies no longer
possess either the necessary documentation or the requisite programming
language expertise. In some cases the source code no longer exists (no,
seriously).

The one Y2K worry that proved to be unfounded concerned the possibility that
various off-the-shelf multipurpose chips used in pieces of old hardware
(mostly low volume scientific and medical devices)--chips so old that no
design documentation had been retained by their manufacturers (or
successors)--might suffer the electronic equivalent of a nervous breakdown
because of their built-in DDMMYY-style date routines, even though those
routines were not utilised by the equipment in question. I am not aware of
any such device that did in fact fail.

But while corporates may have wasted vast amounts of *management* man-hours
trying to understand and get to grips with the problem (don't they always?)
there's really no evidence to support the oft-repeated suggestion that Y2K
was a storm in a teacup or that the IT industry inflated the problem for the
purpose of lining its own nest.

The great Y2K update was in fact something of a triumph; it's a miracle that
so few systems fell over. This can be largely attributed to the fact that,
most unusually, the wretched coders knew exactly what they were supposed to
achieve.

--
Regards, Peter Boulding
pjbn...@UNSPAMpboulding.co.uk (to e-mail, remove "UNSPAM")
Fractal Images and Music: http://www.pboulding.co.uk/
http://www.soundclick.com/bands/default.cfm?bandID=794240&content=music

Dave Saville

Aug 14, 2012, 7:38:05 AM
to
On Tue, 14 Aug 2012 10:35:28 UTC, Martin Brown
<|||newspam|||@nezumi.demon.co.uk> wrote:

<snip>
> Back then y2000 was a very long way away and storage was expensive so
> adding a byte to every date seemed like overkill. Designs that did it
> sensibly (ie not using BCD arithmetic) are good to 2028 if they used
> year zero as 1900 and signed bytes or to 2156 if they used unsigned
> bytes. Many used a year zero of 1950 or 1970 and will work longer.

People today have *no* idea how expensive RAM and disk space was at
the start of it all. When your phone has more storage than an 80's
mainframe, a USB stick more than a room full of disk drives, it's hard
to grasp or even believe. I started out in the mid 60's on an IBM
360/40, soon changed for a 44. It had real hand-knitted core memory,
which came in chunks the size of a housebrick, held 8K bytes and cost
the wrong side of a hundred grand.

If a bank had a million customers then to store four digit years
instead of two would take an extra two million bytes if each customer
only made one transaction - it soon adds up. Disk drives at the time
were measured in K and it would mean *another* room full of drives
with attendant cost, environment etc. And the reason they were stored
as decimal digits was COBOL was the language of choice and it was easy
to define a number as "pic 9999", which made reporting easy and was
stored as character which made the poor old CPU endlessly convert back
and forth in order to do its sums. And no one but no one thought the
code would still be running in 2000. But the old code kept running and
more and more was built on top of it...
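
For what it's worth, the cheap fix actually applied to a lot of this legacy
data was "windowing": keep the two digits on disk and pick the century at
read time, the same idea as the 'IF YEAR < 1970 THEN YEAR=YEAR+100' posted
earlier in the thread. A minimal Python sketch (the pivot value 70 is just
an illustrative choice):

```python
def window_year(yy, pivot=70):
    """Expand a two-digit year: pivot..99 -> 1900s, 00..pivot-1 -> 2000s."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return (1900 if yy >= pivot else 2000) + yy

print(window_year(85))  # -> 1985
print(window_year(12))  # -> 2012
print(window_year(0))   # -> 2000, not 1900
```

The window just postpones the ambiguity, of course: with a pivot of 70 a
date of birth in 1872 and one in 1972 still collide.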

--
Regards
Dave Saville

The Natural Philosopher

Aug 14, 2012, 8:12:42 AM
to
the 64 bit ones will be fine :-)
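
As a quick aside (my arithmetic, not from the thread): "another 26 years"
from 2012 lands almost exactly on the rollover of signed 32-bit Unix time,
which is what the 64-bit ones sidestep.

```python
import datetime

# Signed 32-bit Unix time runs out 2**31 - 1 seconds after the 1970 epoch.
epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
rollover = epoch + datetime.timedelta(seconds=2**31 - 1)
print(rollover.isoformat())  # -> 2038-01-19T03:14:07+00:00
```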

AnthonyL

Aug 14, 2012, 8:17:30 AM
to
Our code was Cobol. Natural Philosopher didn't apply for the
programming vacancy in 1983 otherwise we'd have saved ourselves a lot
of trouble.

Of course Natural Philosopher's code is going to have problems in 2100.


--
AnthonyL

The Natural Philosopher

Aug 14, 2012, 8:20:15 AM
to
Well I won't be here to get blamed haha.

If you really think we will have a banking system in 2100 relying on
130-year-old code, you have been talking to Mervyn King.

Graham Murray

Aug 14, 2012, 8:41:39 AM
to
Martin Brown <|||newspam|||@nezumi.demon.co.uk> writes:

> Back then y2000 was a very long way away and storage was expensive so
> adding a byte to every date seemed like overkill. Designs that did it
> sensibly (ie not using BCD arithmetic) are good to 2028 if they used
> year zero as 1900 and signed bytes or to 2156 if they used unsigned
> bytes. Many used a year zero of 1950 or 1970 and will work longer.

Yet financial systems would have had to cope with the date of birth of
someone born in 1915 and the expected retirement date in 2015 of a man
born in 1950. Or of the birth dates of a centenarian born in (say) 1872
and a baby born in 1972.

The Natural Philosopher

Aug 14, 2012, 9:29:42 AM
to
Most databases have a date-time type that stores DDMMYYYY and HH:MM:SS so
it's not hard.

The problem was really old COBOL and BASIC code, stored presumably in
flat files where two bytes for a year in ASCII was all that was allowed.

Martin Brown

Aug 14, 2012, 9:34:51 AM
to
There is an important difference between transaction date & time stamps
and information held in the customer record. It was with transaction
dates and account interest computations that the problems arose.

It was classic Cobol thinking to use just two digits for this. Data
entry costs money so adding "19" to card punching had costs there too.

The age of the person owning the bank account does not participate in
the day to day computations beyond refusing to open accounts if age is
too low and refusing customers insurance if age is too high.

I usually argue the opposite side of this argument - that they should
have paid more attention to the date year field since many banking
mainframes of that era were using fixed length records on disk with some
of it still unused.

Take a look at the historical Y2k discussions from 1999 (including the
protracted negotiations to create a .y2k group) to see what I mean.

Regards,
Martin Brown

Dave Saville

Aug 14, 2012, 12:00:46 PM
to
On Tue, 14 Aug 2012 13:29:42 UTC, The Natural Philosopher
<t...@invalid.invalid> wrote:

<snip>
>Most databases have a date time type that stores DDMMYYYY and HHMM:SS
so
> its not hard.

Now they do. When those programs were written they didn't - or the DB
itself didn't even exist. Hindsight is always 20/20.
--
Regards
Dave Saville

The Natural Philosopher

unread,
Aug 14, 2012, 12:54:12 PM8/14/12
to
You missed the point.

Only very early code is affected. Anything post 1980 is probably OK forever.

But remember, we were TOLD that aircraft would crash, the internet would
stop, cars wouldn't start..nuclear power stations would blow up.

Not that Sage Accounts 1980 in Sue Grabbit and Ruin would not get the
right accounting balance.

Alan LeHun

Aug 14, 2012, 1:17:12 PM
to
In article <k0dvrl$65q$1...@news.albasani.net>, t...@invalid.invalid says...


> You missed the point.
>
> Only very early code is affected. Anything post 1980 is probably OK forever.

A little later than that. It was only when people started to warn of
impending disaster that the practice was changed. Microsoft Access only
became fully Y2K compliant with the release of Access 96, although
work-around patches were available for the previous version, Access 2.


>
> But remember, we were TOLD that aircraft would crash, the internet would
> stop, cars wouldn't start..nuclear power stations would blow up.

Yes. We were told that they would if we didn't do something about it. We
did do something about it and as a result of that aircraft didn't crash,
the internet didn't stop and no power stations blew up. (Although I seem
to recall that at least one had an unnecessary automatic shutdown).

...



--
Alan LeHun

The Natural Philosopher

Aug 14, 2012, 1:52:01 PM
to
I trawled through about a million lines of code personally, my staff
trawled through ten times that. Yes we made money, but we found nothing
at all in any product we had sold or written.

Or any product we used either.

Complete con.

Peter Boulding

Aug 14, 2012, 3:30:00 PM
to
On Tue, 14 Aug 2012 18:52:01 +0100, The Natural Philosopher
<t...@invalid.invalid> wrote in <k0e381$da7$1...@news.albasani.net>:

>Alan LeHun wrote:
>> In article <k0dvrl$65q$1...@news.albasani.net>, t...@invalid.invalid says...
>>
>>
>>> You missed the point.
>>>
>>> Only very early code is affected. Anything post 1980 is probably OK forever.
>>
>> A little later than that. It was only when people started to warn of
>> impending disaster that the practice was changed. Microsoft access only
>> became fully y2k compliant with the release of Access 96, although work-
>> around patches were available for the previous version, Access 2
>>
>>
>>> But remember, we were TOLD that aircraft would crash, the internet would
>>> stop, cars wouldn't start..nuclear power stations would blow up.
>>
>> Yes. We were told that they would if we didn't do something about it. We
>> did do something about it and as a result of that aircraft didn't crash,
>> the internet didn't stop and no power stations blew up. (Although I seem
>> to recall that at least one had an unnecessary automatic shutdown).
>>
>> ...
>
>I trawled through about a million lines of code personally, my staff
>trawled through ten times that. Yes we made money, but we found nothing
>at all in any product we had sold or written.
>
>Or any product we used either.
>
>Complete con.

Well, bully for you and your staff, who evidently were either uninvolved in
date processing or--more likely--had long been in the excellent habit of
using four-digit years in all recorded values and date processing.

But you are wrong to assume that everyone--or even most of those who had
been in the DP game since the early eighties--had such foresight. One local
authority for which I used to work approached Y2K with a pile of two-digit
date processing software still in use... some of which was written in COBOL,
some in RPG2, and some in a proprietary ICL online transaction processing
system, and all of which were interdependent.
Message has been deleted

Roderick Stewart

Aug 15, 2012, 4:41:19 AM
to
In article <gj5m281lkg034u0j3...@4ax.com>, Phil W Lee
wrote: [re the millennium bug]
> Which is at least to some extent what the coders were saying in 1970.
> Plus, of course, in those days ram was horribly expensive, and
> programmers earned their money by saving a byte here and a byte there,
> and the project lifespan was supposed to be ten years - so why would
> you need the century anyway?
> >
> >If you really think we will have a banking system in 2100 relaying on
> >130 year old code, you have been talking to Mervyn King.

It's not the *code* that should have been allowed for, but the continuing
existence of the *data* that it was handling. Regardless of what
administrative machinery we'll be using in the future, data relating to
health checks, payments, legal entitlements, routine maintenance and so
on, will still relate to the same people and things, or their
descendants. It doesn't matter what software code we use, or what
computers, or what kind of computers, or if we use computers at all,
abbreviating records to two decimal digits for the year number will make
the centuries ambiguous.

The "millennium bug" was the result of recording data with inadequate
precision that related to things that would continue to exist long after
a predictable event that would make the data ambiguous, and that was less
than a lifetime away.

Rod.
--

The Natural Philosopher

Aug 15, 2012, 5:23:22 AM
to
The answer is simple. If a data source is to 2-digit precision it refers to
the 1900s. If it's 4-digit precision it's today's date.

Alternatively send it all to the University of East Anglia climate unit
where they will ignore half of it and lose the rest, after having
mangled it beyond recognition into a hockey stick*, so you don't need to
worry any more.

:-)

* I believe the killer comment was "we used a random number generator to
feed their data processing filter, and do a Monte Carlo analysis. With
pure random data it still produced a hockey stick"
> Rod.
> --

Andy Champ

Aug 15, 2012, 3:50:08 PM
to
On 15/08/2012 09:41, Roderick Stewart wrote:
> The "millennium bug" was the result of recording data with inadequate
> precision that related to things that would continue to exist long after
> a predictable event that would make the data ambiguous, and that was less
> than a lifetime away.

The millennium "bug" was the result of people back in the 60s and 70s,
when two characters containing "19" in every date field would have cost an
arm and a leg, sensibly optimising it out, and leaving the problem until
it was far cheaper to sort out.

Andy

Graham.

Aug 16, 2012, 12:46:38 PM
to
On Sat, 11 Aug 2012 11:17:08 +0100, Roderick Stewart
<rj...@escapetime.removethisbit.myzen.co.uk> wrote:

>In article <LYCxofGX...@bancom.co.uk>, Tony sayer wrote:
>> >> After complaints from a user in my own Mum's house, I took my laptop to
>> >> the seat outside the front door and found 17 networks - this in a low
>> >> density area with a lot of elderly neighbours. If you let your signal
>> >> level drop, it's likely to be swamped by low-level noise from other
>> >> users, especially in a tenement where routers may be only a few metres
>> >> apart.
>> >>
>> >
>> >
>> Try a crowded street in Cambridge. Most everyone has got one sometimes
>> more even;!....
>
>Try a crowded street anywhere. I've found it quite interesting to use a
>smartphone to look at wireless signals while walking to the local shops or on
>a bus ride into town. I can usually pick up at least three other signals
>apart from my own inside my home, half a dozen or more outside or in any
>other residential area, but as soon as I get to any shopping area there are
>usually too many to count. As there aren't nearly enough channels to avoid
>clashes, I can only assume something like the FM "capture effect" must
>prevent co-channel interference from unwanted signals when they are only a
>little bit weaker than the wanted ones, and that the situation is a little
>less chaotic inside the buildings and close to the routers.
>
>The names people choose can sometimes be quite amusing. Names of nearby shops
>or restaurants are quite easy to pick out of course, but I'll sometimes see
>things like "darrensnetwork" or "darrenandtracy" or "eatmyshorts". There's
>one near me called "Go away", and another called "fuck off use your own".
>Others are simply called by manufacturers' names, or "default", or "network"
>or even "Desktop PC". About one in ten, mostly amongst the ones with the less
>creative names, is not secured.
>
>Rod.

One of mine is called CISCO, but it's actually a humble Netgear.

--
Graham.
%Profound_observation%

allegoricus

Aug 28, 2012, 11:38:02 AM
to
On Sat, 11 Aug 2012 11:17:08 +0100, Roderick Stewart
<rj...@escapetime.removethisbit.myzen.co.uk> wrote:

------------8><
>The names people choose can sometimes be quite amusing. Names of nearby shops
>or restaurants are quite easy to pick out of course, but I'll sometimes see
>things like "darrensnetwork" or "darrenandtracy" or "eatmyshorts". There's
>one near me called "Go away", and another called "fuck off use your own".
>Others are simply called by manufacturers' names, or "default", or "network"
>or even "Desktop PC". About one in ten, mostly amongst the ones with the less
>creative names, is not secured.

The most creatively named that I've read about was:
wecanhearyouhavingsex