
More of my philosophy of happiness of USA and happiness of France and more of my thoughts..


Amine Moulay Ramdane

Oct 25, 2022, 7:58:07 PM






Hello,




More of my philosophy of happiness of USA and happiness of France and more of my thoughts..

I am a white Arab from Morocco, and i think i am smart since i have also invented many scalable algorithms and other algorithms..


I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i will ask the following philosophical question:

Why are french people less happy than people in the USA?

Notice how the following web page explains that french people are less happy than people in the USA; read more here in french:

https://www.les4verites.com/economie-4v/letat-providence-nirvana-philosophique-de-luc-ferry


I think i am smart, and i say that France has tried to teach critical thinking to french people, but it has not correctly taught them how to be decent, since France lacks Christian values, and that is why it is lacking in decency; and since it is lacking in decency, this lack of decency makes the french individual less confident towards other french people, so it makes french people much more individualistic, and so it makes french people much less happy. But in the USA there is still the "In God we trust", and there are still Christian values that bring decency, and this decency makes the individual in the USA more confident towards other people in the USA, and this greater confidence makes the individual in the USA less individualistic and it makes the individual in the USA happier, and it is why i am saying the following:

From where does happiness come?


But i think i am smart, and i have to discover the pattern that answers this question, and of course i must not make it complex, so i have to talk about the most important thing, the one that carries the greatest weight of importance: i think that happiness comes from decency in the relationships between people, and i think that this decency is like the soul in the form of a positive energy, and it is what gives happiness. Of course i am not against being rich, since you can be rich and be decent, but notice with me that i cannot say that it is being rich that makes you happy, since i think i am smart and i say that our world is an interconnected world of humans, so happiness is a systemic thing; but i think that decency is such an important factor that it carries a great weight of importance. And of course i can say that decency is like a good teacher that shows the good way of doing things. Also, of course, i am talking about how to become happy in my new philosophy that i have invented; here it is:

https://groups.google.com/g/alt.culture.morocco/c/WDPcc45utLQ

More of my philosophy about happiness and addiction and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ; i have just looked at a video of the french philosopher Luc Ferry about consumerism in capitalism and happiness, and i invite you to look at it:

Conférence Luc Ferry : Les paradoxes de l’économie du bonheur

https://www.youtube.com/watch?v=ywvP4JSZbg8


And i think the french philosopher Luc Ferry in the above video is making a mistake, since he is saying that the logic of consumerism in capitalism is identical to the logic of addiction, so he is like saying that our kind of consumerism in capitalism creates addiction, but i think it is not correct to say so, since addiction is now understood to be a brain disease. Whether it's alcohol, prescription pain pills, nicotine, gambling, or something else, overcoming an addiction isn't as simple as just stopping or exercising greater control over impulses, so i invite you to read the following web page from Yale Medicine so as to understand it:


https://www.yalemedicine.org/news/how-an-addicted-brain-works




More of my philosophy about the reliability of the system and about Liberty and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, i will ask a philosophical question:

How, as a philosopher, do i view Liberty?


So i think i am a smart philosopher, so i think that the defect of white supremacism or neo-nazism or such ideologies is that when you are a good philosopher you will understand that there is the way of politics that you are noticing, like the wanting of a kind of Liberty and the wanting of a kind of human diversity etc., but when you are a good philosopher you will notice that the problem is that you have to look at how reliable the foundations of the philosophy are, and it is the way of being smart to look at it carefully. So i think that i am a smart philosopher, and i say that the problem is that we are requiring a much too difficult and much too complex way of doing from the people so as to make the foundations of the philosophy reliable, but i think that it is a weakness, since i think that you first have to make simple and efficient rules that we understand or believe so as to make the philosophy reliable, and i think that it is my way of doing by inventing two religions, since from the believing i want to make people follow simple and efficient rules so as to make the philosophy and the people reliable.


More of my philosophy about the nature of God and more of my thoughts..


I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have invented two religions, read about them below, and i think that the most probable in a monotheistic religion is that the nature of God is that he is greatly arrogant, and even if he has cursed Adam and Eve and their descendants, God has also shown compassion by bringing us Jesus Christ, and Jesus Christ is the savior, and it is what my new monotheistic religion says; and of course my new monotheistic religion says that God has not created the universe and he has not created humans, but he has created Adam and Eve, who look like humans, and he has cursed Adam and Eve and their descendants. And you can read below about my new monotheistic religion so as to understand it much more deeply:


More of my philosophy about my contributions and more of my thoughts..


I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and now you will know more about my contributions here; for example i have invented a new philosophy and many proverbs and many poems of Love and other poems, and you can take a look at them in the following web link:

https://groups.google.com/g/alt.culture.morocco/c/WDPcc45utLQ


Also i have just quickly invented another new religion, so i have invented two religions, one non-monotheistic religion and one monotheistic religion; here is my new non-monotheistic religion, and you can read about it in the following web link:

https://groups.google.com/g/alt.culture.morocco/c/Ad2tRwDxdjA


And here is the other, monotheistic religion that i have invented (click on the "Amine Moulay Ramdane" post so as to read it):

https://groups.google.com/g/soc.culture.usa/c/vKBlK160YR0


And of course you can read my thoughts about technology in the following web link:

https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4

Also you can read my new writing about new interesting medical treatments and drugs and about antibiotic resistance here:

https://groups.google.com/g/alt.culture.morocco/c/vChmXT_pXUI


And of course you can read my two new poems below and you can read my other interesting thoughts below:


Here is my other new poem and read my below thoughts:


Am i a man or am i a woman ?

But i know that i am not an afghan

But i know that i am not the ape-man

But i know that i am not a caveman

But i want like to be the good and sophisticated plan
And i don't want to be the old like fortran

So even if i don't come from japan

I think i know how to play it like a good jazzman

And i want to be like the sophisticated repairman

Since i have to know how to make afghanistan like a sophisticated japan

And as you notice i am not at all the ku klux klan

And be sure that i have no secret plan

But you are clearly noticing that i want to be the good businessman

And that i am not the madman

So am i a man or am i a woman ?

Since notice that i want to be like a beautiful batman



Thank you,
Amine Moulay Ramdane.

--


Here is my new poem and read all my thoughts below:



I am not sad to say

That i am on my way

And it's like always my beautiful day

And since i am like walking all along the beautiful Broadway

Or it is like my visit to the beautiful Bombay

Since my Love for human life doesn't want to decay

So i want to display my beautiful portray

Of writing to you my poems of Love like a beautiful essay

And I am not sad to say

That i am on my way

Since it's like always my beautiful school day

And even my new religion above is like a saints' day

And my new philosophy above is like a christmas day

And my beautiful thoughts below are like my DNA

So as you see that it is not like the God's Judgment day

But i am showing you my wisdom right away

So as you notice it is like taking the good vitamins such as the vitamin A

And don't forget that my poems of Love below are like my valentine's day

So I am not sad to say

That i am on my beautiful way




Thank you,
Amine Moulay Ramdane.


---


More of my philosophy about the good leadership and more of my thoughts..

I am a white Arab from Morocco, and i think i am smart since i have also invented many scalable algorithms and other algorithms..


I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i will ask the following philosophical question:

What is good leadership?


So if you are smart you will first discover a pattern that is related to what is not good leadership, and it is the following:

A high-IQ person can fail to be wise and can even be mentally handicapped, so good leadership is not just about saying that you have to have a high IQ; being wise is a very important characteristic of a good leader. So what is being wise? i think that it is a good balance that brings good judgment, and being wise is also having good intellectual precision, and this good intellectual precision is also a genetic characteristic, but it is not just having a high IQ; and being wise is also knowing how to behave correctly, and good behavior is also a genetic characteristic, and it is good for relationships and it is good for diplomacy. So as you notice, being a wise man is not just having a high IQ: a high IQ matters since it permits good judgment, but so do the other characteristics that i am talking about above etc.


More of my philosophy about my technicality and about my new philosophy and more of my thoughts..


Also i am technical, and i think i am a serious software developer, and so that you know me more, here is one of the open source software projects that i have invented, so that you notice it:

More of my philosophy about my scalable algorithms of my Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, and I have just written the following:
---

More of my philosophy about the new Zen 4 AMD Ryzen™ 9 7950X and more of my thoughts..


So i have just looked at the new Zen 4 AMD Ryzen™ 9 7950X CPU, and i invite you to look at it here:

https://www.amd.com/en/products/cpu/amd-ryzen-9-7950x

But notice carefully that the problem is with the number of supported memory channels, since it supports just two memory channels, so it is not good: for example my following open source software project, a Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well, scales around 8X on my 16-core Intel Xeon with 2 NUMA nodes and with 8 memory channels, but it will not scale correctly on the new Zen 4 AMD Ryzen™ 9 7950X CPU with just 2 memory channels, since it is also memory-bound, and here is my powerful open source software project of a Parallel C++ Conjugate Gradient Linear System Solver Library that scales very well, and i invite you to take a careful look at it:

https://sites.google.com/site/scalable68/scalable-parallel-c-conjugate-gradient-linear-system-solver-library

So i advise you to buy an AMD EPYC CPU or an Intel Xeon CPU that supports 8 memory channels.
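By the way, the memory-bound argument above can be illustrated with a simple back-of-the-envelope model (a hedged sketch with invented numbers, not measurements of my library): for a bandwidth-limited kernel, the achievable speedup is capped by the number of memory channels rather than by the number of cores.

```python
# Back-of-the-envelope model of a memory-bound kernel: the speedup is
# capped by the aggregate memory bandwidth that the channels provide,
# not by the core count.  All numbers below are invented examples.

def memory_bound_speedup(cores, channels, bw_per_channel, demand_per_core):
    """Speedup limit = min(core count, total bandwidth / one core's demand)."""
    total_bw = channels * bw_per_channel
    return min(cores, total_bw / demand_per_core)

# 16 cores each wanting 20 GB/s, with 20 GB/s per memory channel:
print(memory_bound_speedup(16, 8, 20.0, 20.0))  # 8.0 -> about 8X with 8 channels
print(memory_bound_speedup(16, 2, 20.0, 20.0))  # 2.0 -> 2 channels throttle it
```

Under these made-up numbers the model reproduces the roughly 8X scaling reported above with 8 channels, and predicts why 2 channels would cap the same kernel near 2X.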

---


And of course you can use the next twelve DDR5 memory channels of the Zen 4 AMD EPYC CPUs so as to scale my above algorithm even more, and read about it here:

https://www.tomshardware.com/news/amd-confirms-12-ddr5-memory-channels-on-genoa


And here is the simulation program that uses the probabilistic mechanism that i have talked about and that proves to you that the algorithm of my Parallel C++ Conjugate Gradient Linear System Solver Library is scalable:

If you look at my scalable parallel algorithm, it divides each array of the matrix into parts of 250 elements, and if you look carefully i am using two functions that consume the greater part of all the CPU time, atsub() and asub(), and inside those functions i am using a probabilistic mechanism so as to render my algorithm scalable on NUMA architecture, and it also makes it scale on the memory channels. What i am doing is scrambling the array parts using a probabilistic function, and what i have noticed is that this probabilistic mechanism is very efficient.

To prove to you what i am saying, please look at the following simulation that i have done using a variable that contains the number of NUMA nodes, and what i have noticed is that my simulation gives almost perfect scalability on NUMA architecture. For example, let us give the "NUMA_nodes" variable a value of 4, and our array a value of 250: the simulation below will give a number of contention points of about a quarter of the array, so if i am using 16 cores, in the worst case it will scale to 4X throughput on NUMA architecture, because since i am using an array of 250 and a quarter of the array positions are contention points, from Amdahl's law this gives a scalability of almost 4X throughput on four NUMA nodes, and this gives almost perfect scalability on more and more NUMA nodes, so my parallel algorithm is scalable on NUMA architecture and it also scales well on the memory channels.

Here is the simulation that i have done; please run it and you will notice yourself that my parallel algorithm is scalable on NUMA architecture.

Here it is:

---
program test;

uses math;

var
  tab, tab1, tab2: array of integer;
  a, n1, k, i, n2, tmp, j, numa_nodes: integer;

begin

  a := 250;
  numa_nodes := 4;

  { Map each array position to a NUMA node in round-robin fashion }
  setlength(tab2, a);
  for i := 0 to a - 1 do
    tab2[i] := i mod numa_nodes;

  { Fill the first array and scramble it with a probabilistic permutation }
  setlength(tab, a);
  randomize;
  for k := 0 to a - 1 do
    tab[k] := k;
  n2 := a - 1;
  for k := 0 to a - 1 do
  begin
    n1 := random(n2);
    tmp := tab[k];
    tab[k] := tab[n1];
    tab[n1] := tmp;
  end;

  { Fill the second array and scramble it the same way }
  setlength(tab1, a);
  randomize;
  for k := 0 to a - 1 do
    tab1[k] := k;
  n2 := a - 1;
  for k := 0 to a - 1 do
  begin
    n1 := random(n2);
    tmp := tab1[k];
    tab1[k] := tab1[n1];
    tab1[n1] := tmp;
  end;

  { Count the positions where both scrambled arrays land on the same NUMA node }
  j := 0;
  for i := 0 to a - 1 do
    if tab2[tab[i]] = tab2[tab1[i]] then
    begin
      inc(j);
      writeln('A contention at: ', i);
    end;

  writeln('Number of contention points: ', j);

  setlength(tab, 0);
  setlength(tab1, 0);
  setlength(tab2, 0);
end.
---
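The Amdahl's-law estimate above (a quarter of the array positions being contention points bounding the throughput at about 4X) can be checked numerically with the standard Amdahl formula; this is a sketch of the textbook calculation, not part of the library itself:

```python
# Amdahl's law check for the simulation above: with a serial (contended)
# fraction of 1/4, throughput is bounded by 1/0.25 = 4X no matter how
# many cores or NUMA nodes are used.

def amdahl_speedup(serial_fraction, workers):
    """Speedup of a workload whose serial fraction cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

serial_fraction = 0.25                      # a quarter of the 250 positions contend
print(round(amdahl_speedup(serial_fraction, 16), 2))     # with 16 cores
print(round(amdahl_speedup(serial_fraction, 10**9), 2))  # asymptotic bound: 4.0
```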



And i invite you to read my thoughts about technology here:

https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4

More of my philosophy about the problem with capacity planning of a website and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, and i have just invented a new methodology that greatly simplifies the capacity planning of a website that can be of a three-tier architecture, with the web servers and with the application servers and with the database servers, but i have to explain more so that you understand the big problem with capacity planning of a website: when you want, for example, to use web testing, the problem is how to choose, for example, the correct distribution of the read and write and delete transactions on the database of a website? For if it is not realistic you can go beyond the knee of the curve and get an unacceptable waiting time, and the Mean value analysis (MVA) algorithm has the same problem, so how to solve the problem? So as you are noticing, it is why i have come up with my new methodology that uses mathematics to solve the problem. And read my previous thoughts:


More of my philosophy about website capacity planning and about Quality of service and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, so i think that you have to lower the QoS (quality of service) of a website to a certain level, since you have to fix a limit on the number of connections that we allow to the website so as not to go beyond the knee of the curve, and of course i will soon show you the mathematical calculations of my new methodology of how to do capacity planning of a website, and of course you have to know that we have to do capacity planning using mathematics so as to know the average waiting time etc., and this permits us to calculate the number of connections that we allow to the website.
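The "knee of the curve" and the connection limit can be illustrated with the standard M/M/1 queueing formula (textbook queueing theory, not my new methodology, and the rates below are invented example numbers): the mean response time explodes as the arrival rate approaches the service rate, so capping the admitted load keeps the waiting time acceptable.

```python
# M/M/1 sketch of the "knee of the curve": mean response time
# R = 1/(mu - lam) grows without bound as utilization lam/mu -> 1.
# The service rate and the response-time target are example numbers.

def mm1_response_time(lam, mu):
    """Mean response time of an M/M/1 queue (requires lam < mu)."""
    if lam >= mu:
        raise ValueError("unstable: arrival rate must stay below service rate")
    return 1.0 / (mu - lam)

def max_admitted_load(mu, r_max):
    """Largest arrival rate keeping mean response time under r_max:
    from 1/(mu - lam) <= r_max we get lam <= mu - 1/r_max."""
    return mu - 1.0 / r_max

mu = 100.0                            # say the server handles 100 requests/s
print(mm1_response_time(50.0, mu))    # 0.02 s at half load
print(mm1_response_time(99.0, mu))    # 1.0 s just past the knee
print(max_admitted_load(mu, 0.1))     # 90.0: cap the load to keep R <= 100 ms
```

The last line is exactly the connection-limit idea above: from a response-time target you solve backwards for the maximum load you may admit.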

More of my philosophy about the Mean value analysis (MVA) algorithm and more of my thoughts..


I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, and i have just read the following paper
about the Mean value analysis (MVA) algorithm, and i invite you to read it carefully:

https://www.cs.ucr.edu/~mart/204/MVA.pdf


But i say that i am understanding the above paper on the Mean value analysis (MVA) algorithm easily, but i say that the above paper requires you to empirically collect the visit ratios and the average demand of each class, so it is not so practical, since i say that you can and you have, for example, to calculate the "tendency" by also, for example, rendering the non-memoryless service of, for example, the database into a memoryless service, but don't worry, since i will soon make you understand my powerful methodology with all the mathematical calculations that ease the job for you and that make it much more practical.
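For concreteness, here is the standard exact single-class MVA recursion from the queueing literature (my own sketch of the textbook algorithm with invented service demands, not the code from the paper above):

```python
# Exact single-class Mean Value Analysis for a closed queueing network:
# iterate the population from 1 to N, using the arrival theorem (an
# arriving customer sees the queue lengths of the network with one
# customer fewer).  The service demands below are invented examples.

def mva(demands, n_customers):
    """Return (throughput, per-station mean queue lengths)."""
    q = [0.0] * len(demands)              # queue lengths with 0 customers
    x = 0.0
    for n in range(1, n_customers + 1):
        # Station response times: demand times (1 + queue seen on arrival).
        r = [d * (1.0 + qk) for d, qk in zip(demands, q)]
        x = n / sum(r)                    # system throughput by Little's law
        q = [x * rk for rk in r]          # new mean queue lengths
    return x, q

# Example: CPU demand 0.005 s and disk demand 0.030 s per transaction.
throughput, queues = mva([0.005, 0.030], 20)
print(round(throughput, 2))               # approaches the disk bound 1/0.030
```

Note how the only inputs are the per-station service demands, which is precisely why they must be measured or estimated before the algorithm is usable.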

More of my philosophy about formal methods and about Leslie Lamport and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, and I have just looked at the following video about the man who revolutionized computer science with math, and i invite you to look at it:

https://www.youtube.com/watch?v=rkZzg7Vowao

So i say that in mathematics, a conjecture is a conclusion or a proposition that is proffered on a tentative basis, without proof. And Leslie Lamport, the known scientist, says in the above video the following: "An algorithm without a proof is conjecture, and if you are proving things, that means using mathematics." So i think that Leslie Lamport the known scientist is not thinking correctly by saying so, since i think that you can also prove an algorithm by raising much more the probability of the success of the proof without using mathematics to prove the algorithm, and i say that a proof does not have to be just a conclusion in boolean logic of true or false, since i think that a proof can be a conclusion in fuzzy logic, and by logical analogy it looks like how race detectors in the very aggressive mode don't detect all the data races, so they miss a really small number of real races, so it is like a very high probability of really detecting real races, so read my below thoughts about it so that you understand my views. And i think that the second mistake of Leslie Lamport the known scientist is that he wants us to use formal methods, but read the following interesting article below about why people don't use formal methods:

And I invite you to read the following new article about the known computer expert in the above video called Leslie Lamport, which says programmers need to use math by using formal methods, and in which Lamport discusses some of his work, such as the TLA+ specification language (developed by Lamport over the past few decades, the TLA+ [Temporal Logic of Actions] specification language allows engineers to describe the objectives of a program in a precise and mathematical way), and also cites some of the reasons why he gives a prominent place to mathematics in programming.

Read more in the following article, and you will have to translate it from french to english:

https://www.developpez.com/actu/333640/Un-expert-en-informatique-declare-que-les-programmeurs-ont-besoin-de-plus-de-mathematiques-ajoutant-que-les-ecoles-devraient-repenser-la-facon-dont-elles-enseignent-l-informatique/

But to answer the above expert called Leslie Lamport, i invite you to carefully read the following interesting web page about why people don't use formal methods:

WHY DON'T PEOPLE USE FORMAL METHODS?

https://www.hillelwayne.com/post/why-dont-people-use-formal-methods/


More of my philosophy of the polynomial-time complexity of race detection and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have quickly understood how Rust detects race conditions, but i think that a slew of "partial order"-based methods have been proposed whose goal is to predict data races in polynomial time, but at the cost of being incomplete and failing to detect data races in "some" traces. These include algorithms based on the classical happens-before partial order, and those based on newer partial orders that improve the prediction of data races over happens-before, so i think that we have to be optimistic; read the following web page about the Sanitizers:

https://github.com/google/sanitizers

And notice carefully the ThreadSanitizer, so read carefully
the following paper about ThreadSanitizer:

https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/35604.pdf


And it says in the conclusion the following:

"ThreadSanitizer uses a new algorithm; it has several modes of operation, ranging from the most conservative mode (which has few false positives but also misses real races) to a very aggressive one (which
has more false positives but detects the largest number of
real races)."

So as you are noticing, since the very aggressive mode doesn't detect all the data races, it misses a really small number of real races, so it is like a very high probability of really detecting real races, and i think that you can also use my below methodology of incrementally using a model from the source code with the Spin model checker so as to raise even more the probability of detecting real races.
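To make the happens-before idea above concrete, here is a minimal vector-clock sketch (my own illustrative code, not ThreadSanitizer's actual algorithm): two accesses to the same variable race when at least one of them is a write and neither access happens-before the other.

```python
# Minimal happens-before race check using vector clocks.  An
# illustrative sketch, not ThreadSanitizer's real implementation: each
# access records the thread's vector clock, and two accesses to the
# same location race iff one is a write and the clocks are incomparable.

def happens_before(c1, c2):
    """True if vector clock c1 happens-before (or equals) c2."""
    return all(a <= b for a, b in zip(c1, c2))

def is_race(access1, access2):
    """Each access is (clock, is_write). Concurrent accesses race
    when at least one of them is a write."""
    (c1, w1), (c2, w2) = access1, access2
    concurrent = not happens_before(c1, c2) and not happens_before(c2, c1)
    return concurrent and (w1 or w2)

# Thread 0 writes at clock [1, 0]; thread 1 reads at clock [0, 1]:
# neither is ordered before the other, so this is a data race.
print(is_race(([1, 0], True), ([0, 1], False)))   # True

# If thread 1 synchronized first (e.g. acquired a lock released by
# thread 0), its clock becomes [1, 1] and the race disappears.
print(is_race(([1, 0], True), ([1, 1], False)))   # False
```

The incompleteness discussed above shows up here too: the verdict depends on which synchronization order the observed trace happened to record.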


Read my previous thoughts:

More of my philosophy about race conditions and about composability and more of my thoughts..

I say that a model is a representation of something. It captures not all attributes of the represented thing, but rather only those that seem relevant. So my way of doing software development in Delphi and Freepascal is also that i am using a "model" of the source code that i am executing in the Spin model checker so as to detect race conditions, so i invite you to take a look at the following new tutorial that uses the powerful Spin tool:

https://mirrors.edge.kernel.org/pub/linux/kernel/people/paulmck/perfbook/perfbook.html

So you can for example install the Spin model checker so as to detect race conditions; this is how you will get much more professional at detecting deadlocks and race conditions in parallel programming. And i invite you to look at the following video so as to know how to install the Spin model checker on windows:

https://www.youtube.com/watch?v=MGzmtWi4Oq0

More of my philosophy about race detection and concurrency and more..

I have just looked quickly at different race detectors, and i think that the Intel Thread Checker from the Intel company from "USA" is also very good, since the Intel Thread Checker needs to instrument either the C++ source code or the compiled binary to make every memory reference and every standard Win32 synchronization primitive observable, and this instrumentation from the source code is very good since it also permits me to port my scalable algorithm inventions by, for example, wrapping them in some native Windows synchronization APIs, and this instrumentation from the source code is also business friendly, so read about different race detectors and about the Intel Thread Checker here:

https://docs.microsoft.com/en-us/archive/msdn-magazine/2008/june/tools-and-techniques-to-identify-concurrency-issues

So i think that the race detectors of other programming languages have to provide this instrumentation from the source code, as the Intel Thread Checker from the Intel company from "USA" does.

More of my philosophy about Rust and about memory models and about technology and more of my thoughts..


I think i am highly smart, and i say that the new programming language that we call Rust has an important problem: read the following interesting article, which says that atomic operations that do not have correct memory ordering can still cause race conditions in safe code, and this is why the suggestion made by the researchers is:

"Race detection techniques are needed for Rust, and they should focus on unsafe code and atomic operations in safe code."


Read more here:

https://www.i-programmer.info/news/98-languages/12552-is-rust-really-safe.html
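As a side note, the same phenomenon (individually safe operations composing into a race) can be sketched in any language; here is my own hedged Python illustration, unrelated to the Rust paper's code, of a non-atomic read-modify-write and its lock-based fix:

```python
# A check-then-act race: each individual operation is "safe", but the
# interleaving between the read and the write is not atomic, so two
# threads can both read the same value and one increment may be lost.
import threading

def unsafe_increments(n_threads=8, per_thread=10000):
    counter = [0]
    def work():
        for _ in range(per_thread):
            counter[0] = counter[0] + 1   # read-modify-write, not atomic
    threads = [threading.Thread(target=work) for _ in range(n_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return counter[0]   # may come out less than n_threads * per_thread

def safe_increments(n_threads=8, per_thread=10000):
    counter = [0]
    lock = threading.Lock()
    def work():
        for _ in range(per_thread):
            with lock:                    # whole read-modify-write is atomic
                counter[0] = counter[0] + 1
    threads = [threading.Thread(target=work) for _ in range(n_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return counter[0]   # always exactly n_threads * per_thread

print(safe_increments())   # 80000
```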


More of my philosophy about programming languages and about lock-based systems and more..

I think we have to be optimistic about lock-based systems, since race condition detection can be done in polynomial time, and you can notice it by reading the following paper:

https://arxiv.org/pdf/1901.08857.pdf

Or by reading the following paper:

https://books.google.ca/books?id=f5BXl6nRgAkC&pg=PA421&lpg=PA421&dq=race+condition+detection+and+polynomial+complexity&source=bl&ots=IvxkORGkQ9&sig=ACfU3U2x0fDnNLHP1Cjk5bD_fdJkmjZQsQ&hl=en&sa=X&ved=2ahUKEwjKoNvg0MP0AhWioXIEHRQsDJc4ChDoAXoECAwQAw#v=onepage&q=race%20condition%20detection%20and%20polynomial%20complexity&f=false

So i think we can continue to program in lock-based systems, and about the composability of lock-based systems, read my following thoughts about it:

More of my philosophy about composability and about Haskell functional language and more..

I have just read quickly the following article about composability,
so i invite you to read it carefully:

https://bartoszmilewski.com/2014/06/09/the-functional-revolution-in-c/

I am not in accordance with the above article, and i think that the above scientist programs in the Haskell functional language and that for him it is the way to composability, since he says that the way of functional programming, like Haskell functional programming, is the way that allows composability in the presence of concurrency, but that lock-based systems don't allow it; i don't agree with him, and i will give you the logical proof of it. Here it is: read what an article from ACM says, written by both Bryan M. Cantrill and Jeff Bonwick from Sun Microsystems:

You can read about Bryan M. Cantrill here:

https://en.wikipedia.org/wiki/Bryan_Cantrill

And you can read about Jeff Bonwick here:

https://en.wikipedia.org/wiki/Jeff_Bonwick

And here is what the article says about composability in the presence of concurrency in lock-based systems:

"Design your systems to be composable. Among the more galling claims of the detractors of lock-based systems is the notion that they are somehow uncomposable:

“Locks and condition variables do not support modular programming,” reads one typically brazen claim, “building large programs by gluing together smaller programs[:] locks make this impossible.”9 The claim, of course, is incorrect. For evidence one need only point at the composition of lock-based systems such as databases and operating systems into larger systems that remain entirely unaware of lower-level locking.

There are two ways to make lock-based systems completely composable, and each has its own place. First (and most obviously), one can make locking entirely internal to the subsystem. For example, in concurrent operating systems, control never returns to user level with in-kernel locks held; the locks used to implement the system itself are entirely behind the system call interface that constitutes the interface to the system. More generally, this model can work whenever a crisp interface exists between software components: as long as control flow is never returned to the caller with locks held, the subsystem will remain composable.

Second (and perhaps counterintuitively), one can achieve concurrency and
composability by having no locks whatsoever. In this case, there must be
no global subsystem state—subsystem state must be captured in per-instance state, and it must be up to consumers of the subsystem to assure that they do not access their instance in parallel. By leaving locking up to the client of the subsystem, the subsystem itself can be used concurrently by different subsystems and in different contexts. A concrete example of this is the AVL tree implementation used extensively in the Solaris kernel. As with any balanced binary tree, the implementation is sufficiently complex to merit componentization, but by not having any global state, the implementation may be used concurrently by disjoint subsystems—the only constraint is that manipulation of a single AVL tree instance must be serialized."

Read more here:

https://queue.acm.org/detail.cfm?id=1454462
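The first approach the quote describes (locking kept entirely internal to the subsystem, with control never returned to the caller while a lock is held) can be sketched like this; this is my own illustrative Python example, not code from the article:

```python
# Sketch of the first composability approach from the quote above: all
# locking stays internal to the subsystem, and control never returns to
# the caller with the lock held, so callers can compose this subsystem
# freely without even knowing that it locks at all.
import threading

class Registry:
    """A thread-safe key/value subsystem with fully internal locking."""
    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def put(self, key, value):
        with self._lock:              # lock is released before returning
            self._data[key] = value

    def get(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)

# Callers compose it like any ordinary object, with no caller-side locking.
r = Registry()
threads = [threading.Thread(target=r.put, args=(i, i * i)) for i in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print([r.get(i) for i in range(4)])   # [0, 1, 4, 9]
```

This mirrors the quote's operating-system example: the locks live entirely behind the subsystem's interface, which is what keeps it composable.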

More of my philosophy about HP and about the Tandem team and more of my thoughts..


I invite you to read the following interesting article so as to notice how HP was smart by also acquiring Tandem Computers, Inc. with their "NonStop" systems and by learning from the Tandem team that has also extended HP NonStop to the x86 server platform; you can read about it in my below writing, and you can read about Tandem Computers here: https://en.wikipedia.org/wiki/Tandem_Computers , so notice that Tandem Computers, Inc. was the dominant manufacturer of fault-tolerant computer systems for ATM networks, banks, stock exchanges, telephone switching centers, and other similar commercial transaction processing applications requiring maximum uptime and zero data loss:

https://www.zdnet.com/article/tandem-returns-to-its-hp-roots/

More of my philosophy about HP "NonStop" to x86 Server Platform fault-tolerant computer systems and more..

Now HP to Extend HP NonStop to x86 Server Platform

HP announced in 2013 plans to extend its mission-critical HP NonStop technology to x86 server architecture, providing the 24/7 availability required in an always-on, globally connected world, and increasing customer choice.

Read the following to notice it:

https://www8.hp.com/us/en/hp-news/press-release.html?id=1519347#.YHSXT-hKiM8

And today HP provides HP NonStop to x86 Server Platform, and here is
an example, read here:

https://www.hpe.com/ca/en/pdfViewer.html?docId=4aa5-7443&parentPage=/ca/en/products/servers/mission-critical-servers/integrity-nonstop-systems&resourceTitle=HPE+NonStop+X+NS7+%E2%80%93+Redefining+continuous+availability+and+scalability+for+x86+data+sheet

So i think that programming the HP NonStop systems on x86 is now compatible with x86 programming.

And i invite you to read my thoughts about technology here:

https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4


More of my philosophy about stack allocation and more of my thoughts..


I think i am highly smart since I have passed two certified IQ tests and i have scored "above" 115 IQ, so i have just looked at the x64 assembler of the C/C++ _alloca function that allocates size bytes of space from the stack. It uses x64 assembler instructions to move the RSP register, and i think that it also aligns the address and ensures that it doesn't go beyond the stack limit etc., and i have quickly understood its x64 assembler, and i invite you to look at it here:

64-bit _alloca. How to use from FPC and Delphi?

https://www.atelierweb.com/64-bit-_alloca-how-to-use-from-delphi/


But i think i am smart and i say that the benefit of using a stack comes mostly from the "reusability" of the stack. It is done this way since, from a thread, you execute functions or procedures and then exit from them, and this exiting makes the stack memory available again for "reusability". It is why i think that using a dynamically allocated array as a stack is also useful, since it offers the same benefits of reusability, and i think that the dynamic allocation of the array will not be expensive. So it is why i think i will implement the _alloca function using a dynamically allocated array, and i think it will also be good for my sophisticated coroutines library that you can read about from my following thoughts about preemptive and non-preemptive timesharing in the following web link:
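And here is a minimal sketch of this reusable-stack idea on top of a dynamically allocated array (the ArrayStack class and its names are hypothetical, a simple bump-pointer design, not my actual implementation): allocation bumps an offset, and leaving a "frame" resets the offset so the same memory is reused by later calls:

```python
# Sketch of alloca-style reuse on top of one dynamically allocated
# array: alloc() bumps an offset, and release() restores a saved
# frame marker so the same memory is reused by later allocations.
class ArrayStack:
    def __init__(self, size):
        self.buf = bytearray(size)   # one dynamic allocation up front
        self.top = 0

    def alloc(self, nbytes):
        if self.top + nbytes > len(self.buf):
            raise MemoryError("stack overflow")
        start = self.top
        self.top += nbytes
        return memoryview(self.buf)[start:self.top]

    def mark(self):
        return self.top              # save the current frame position

    def release(self, marker):
        self.top = marker            # frame exit: memory reusable again

stack = ArrayStack(1024)
frame = stack.mark()
a = stack.alloc(100)                 # "local" allocation inside the frame
stack.release(frame)                 # exiting the frame frees the space
b = stack.alloc(100)                 # the same 100 bytes are reused
print(stack.top)                     # → 100
```

The single up-front allocation is what keeps the per-call cost low, like the real stack.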


https://groups.google.com/g/alt.culture.morocco/c/JuC4jar661w


And i invite you to read my thoughts about technology here:

https://groups.google.com/g/soc.culture.usa/c/N_UxX3OECX4


More of my philosophy about the German model and about quality and more of my thoughts..

I think i am highly smart since I have passed two certified IQ tests and i have scored above 115 IQ, so i will ask the following philosophical question of:


Why is Germany so successful in spite of least working hours?


So i think one of the most important factors are:


Of course the first factor is that Germany has good schools and vocational training - for everyone. This makes the average worker much more productive in terms of value add per hour.

And the second "really" important factor is the following:

It’s in the culture of Germany to focus on quality and being effective (all the way back to Martin Luther and his protestant work ethic)... Higher quality in every step of the chain leads to a massive reduction in defects and rework, and this increases everyone’s productivity. But notice that in my below thoughts i am also speaking about the other ways to increase productivity, such as specialization etc., and the way of the German model to focus on quality and being effective, by focusing on quality in every step of the chain so that to massively reduce defects and rework, is also done by the methodologies of quality control and Six Sigma etc., so read my following thoughts about them:

More of my philosophy about quality control and more of my thoughts..

I have just looked and understood quickly the following paper about SPC(Statistical process control):

https://owic.oregonstate.edu/sites/default/files/pubs/EM8733.pdf


I think i am highly smart, but i think that the above paper doesn't speak about the fact that you can apply the central limit theorem as following:

The central limit theorem states that the sampling distribution of the mean of any independent, random variable will be normal or nearly normal, if the sample size is large enough.

Also the above paper doesn't speak about the following very important things:

And I have quickly understood quality control with SPC (Statistical process control), and i have just discovered a smart pattern with my fluid intelligence: with SPC we can debug the process, like in software programming, by looking at its variability. If the variability doesn't follow a normal distribution, it means that there are defects in the process, and we say that there are special causes that cause those defects. And if the variability follows a normal distribution, we say that the process is stable and has only common causes, and it means that we can control it much more easily by looking at the control charts, which permit us to debug and control the variability, for example by changing the machines or robots and measuring again with the control charts.
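And the control limits of an X-bar chart can be sketched as follows (the measurements are hypothetical, and note that a real control chart would usually estimate sigma from within-sample ranges rather than from the pooled data as in this simplified sketch):

```python
import statistics

# Sketch of an X-bar control chart: when only common causes are
# present, sample means should fall within mean +/- 3*sigma/sqrt(n).
samples = [                           # hypothetical measurements, 4 per sample
    [10.1, 9.9, 10.0, 10.2],
    [9.8, 10.1, 10.0, 9.9],
    [10.0, 10.3, 9.7, 10.0],
]
n = len(samples[0])
means = [statistics.mean(s) for s in samples]
grand_mean = statistics.mean(means)
sigma = statistics.stdev([x for s in samples for x in s])  # pooled estimate

ucl = grand_mean + 3 * sigma / n ** 0.5   # upper control limit
lcl = grand_mean - 3 * sigma / n ** 0.5   # lower control limit

# A point outside [lcl, ucl] would signal a special cause to debug.
in_control = all(lcl <= m <= ucl for m in means)
print(in_control)                          # → True
```

A sample mean falling outside the limits is the signal that a special cause has entered the process.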

More of my philosophy about the Post Graduate Program on lean Six Sigma and more..

More of my philosophy about Six Sigma and more..

I think i am smart, and now i will talk more about Six Sigma
since i have just talked about SPC (Statistical process control), so
you have to know that Six Sigma needs to fulfill the following steps:

1- Define the project goals and customer (external and internal)
deliverables.

2- Measure the process so that to determine current performance and
quantify the problem.

3- Analyze and determine the root cause(s) of the defects.

4- Improve the process by eliminating the defects.

5- Control future performance so the improved process doesn't degrade.


And you have to know that those steps (known as DMAIC) are also important steps toward attaining ISO 9000 certification, and notice that you can use SPC (Statistical process control) and the control charts on step [3] and step [4] above.

Other than that, i have just read the following interesting and important paper about SPC (Statistical process control) that explains the whole process of SPC, so i invite you to read it
carefully:

https://owic.oregonstate.edu/sites/default/files/pubs/EM8733.pdf

So as you notice in the above paper, the central limit theorem in mathematics is so important, but notice carefully that the necessary and important condition so that the central limit theorem works is that you have to use independent and random variables. And notice in the above paper that you have to do two things: you have to reduce or eliminate the defects, and you have to control the "variability" of the defects, and this is why the paper is talking about how to construct a control chart. Other than that, the central limit theorem is not only related to SPC (Statistical process control), but it is also related to PERT and my PERT++ software project below. And notice that in my software project below that is called PERT++, i have provided you with two ways of how to estimate the critical path: first, by the way of CPM (Critical Path Method) that shows all the arcs of the estimate of the critical path, and second, by the way of the central limit theorem by using the inverse normal distribution function. And you have to provide my software project that is called PERT++ with three types of estimates that are the following:

Optimistic time - generally the shortest time in which the activity
can be completed. It is common practice to specify optimistic times
to be three standard deviations from the mean so that there is
approximately a 1% chance that the activity will be completed within
the optimistic time.

Most likely time - the completion time having the highest
probability. Note that this time is different from the expected time.

Pessimistic time - the longest time that an activity might require. Three standard deviations from the mean is commonly used for the pessimistic time.
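And the three estimates above can be combined with the commonly used PERT beta approximation, sketched as follows (the activity numbers here are hypothetical):

```python
# Common PERT (beta) approximation for one activity from the three
# estimates above: expected time te = (O + 4M + P) / 6, and the
# standard deviation sigma = (P - O) / 6 (six sigmas span O..P).
def pert_estimate(optimistic, most_likely, pessimistic):
    te = (optimistic + 4 * most_likely + pessimistic) / 6
    sigma = (pessimistic - optimistic) / 6
    return te, sigma

# Hypothetical activity: 4 days optimistic, 6 most likely, 14 pessimistic.
te, sigma = pert_estimate(4, 6, 14)
print(te)      # → 7.0 expected days
print(sigma)   # about 1.67 days of standard deviation
```

Summing the te values along the critical path, and the variances (sigma squared) of its activities, is what lets the central limit theorem give the distribution of the whole project duration.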

And you can download my PERT++ from reading my following below thoughts:

More of my philosophy about the central limit theorem and about my PERT++ and more..

The central limit theorem states that the sampling distribution of the mean of any independent, random variable will be normal or nearly normal, if the sample size is large enough.

How large is "large enough"?

In practice, some statisticians say that a sample size of 30 is large enough when the population distribution is roughly bell-shaped. Others recommend a sample size of at least 40. But if the original population is distinctly not normal (e.g., is badly skewed, has multiple peaks, and/or has outliers), researchers like the sample size to be even larger. So i invite you to read my following thoughts about my software project that is called PERT++, and notice that the PERT networks are referred to by some researchers as "probabilistic activity networks" (PAN) because the durations of some or all of the arcs are independent random variables with known probability distribution functions and finite ranges. So PERT uses the central limit theorem (CLT) to find the expected project duration.
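And this can be illustrated with a quick simulation (the data is hypothetical, drawn from a uniform, distinctly non-normal distribution):

```python
import random
import statistics

# Quick illustration of the CLT: means of samples drawn from a
# uniform (non-normal) distribution cluster tightly and symmetrically
# around the population mean once the sample size is large enough.
random.seed(42)
sample_size = 40                     # "large enough" per the rule above
sample_means = [
    statistics.mean(random.uniform(0, 1) for _ in range(sample_size))
    for _ in range(1000)
]

# Population mean of uniform(0, 1) is 0.5; the standard error of the
# sample mean is sqrt(1/12) / sqrt(40), about 0.046.
print(round(statistics.mean(sample_means), 2))   # → 0.5
print(statistics.stdev(sample_means) < 0.06)     # → True
```

So even though each individual observation is uniform, the sampling distribution of the mean is already nearly normal, which is what both SPC and PERT rely on.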

And as you are noticing this Central Limit Theorem is also so important
for quality control, read the following to notice it(I also understood Statistical Process Control (SPC)):

An Introduction to Statistical Process Control (SPC)

https://www.engineering.com/AdvancedManufacturing/ArticleID/19494/An-Introduction-to-Statistical-Process-Control-SPC.aspx


So, i have designed and implemented my PERT++ that is important for quality, please read about it and download it from my website here:

https://sites.google.com/site/scalable68/pert-an-enhanced-edition-of-the-program-or-project-evaluation-and-review-technique-that-includes-statistical-pert-in-delphi-and-freepascal

---


So I have provided you in my PERT++ with the following functions:


function NormalDistA (const Mean, StdDev, AVal, BVal: Extended): Single;

function NormalDistP (const Mean, StdDev, AVal: Extended): Single;

function InvNormalDist(const Mean, StdDev, PVal: Extended; const Less: Boolean): Extended;

For NormalDistA() or NormalDistP(), you pass the best estimate of completion time to Mean, and you pass the critical path standard deviation to StdDev, and you will get the probability of the value Aval or the probability between the values of Aval and Bval.

For InvNormalDist(), you pass the best estimate of completion time to Mean, and you pass the critical path standard deviation to StdDev, and you will get the length of the critical path of the probability PVal, and when Less is TRUE, you will obtain a cumulative distribution.
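And the role of these functions can be sketched in Python as follows (this is an illustration of the same mathematics, not my Delphi/FreePascal code, and the critical path numbers are hypothetical):

```python
import math

# The probability of finishing by time t is the normal CDF of t given
# the critical path mean and standard deviation (role of NormalDistP),
# and the inverse gives the completion time for a target probability
# (role of InvNormalDist), computed here by simple bisection.
def normal_cdf(x, mean, stddev):
    return 0.5 * (1 + math.erf((x - mean) / (stddev * math.sqrt(2))))

def inv_normal(p, mean, stddev):
    lo, hi = mean - 10 * stddev, mean + 10 * stddev
    for _ in range(100):              # bisection on the monotone CDF
        mid = (lo + hi) / 2
        if normal_cdf(mid, mean, stddev) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical critical path: best estimate 40 days, stddev 3 days.
print(round(normal_cdf(43, 40, 3), 3))    # → 0.841, chance of finishing by day 43
print(round(inv_normal(0.95, 40, 3), 1))  # → 44.9, the 95% completion time
```

Note that Python's statistics.NormalDist offers cdf() and inv_cdf() built in; the bisection above just makes the computation explicit.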


So as you are noticing from my above thoughts that since PERT networks are referred to by some researchers as "probabilistic activity networks" (PAN) because the duration of some or all of the arcs are independent random variables with known probability distribution functions, and have finite ranges. So PERT uses the central limit theorem (CLT) to find the expected project duration. So then you have to use my above functions
that are Normal distribution and inverse normal distribution functions, please look at my demo inside my zip file to understand better how i am doing it:

You can download and read about my PERT++ from my website here:

https://sites.google.com/site/scalable68/pert-an-enhanced-edition-of-the-program-or-project-evaluation-and-review-technique-that-includes-statistical-pert-in-delphi-and-freepascal

I think i am smart and i invite you to read carefully the following webpage of Alan Robinson, Professor of Operations Management at the University of Massachusetts. He is a full-time professor at the Isenberg School of Management of UMass and a consultant and book author specializing in managing ideas (idea-generation and idea-driven organization) and building high-performance organizations, creativity, innovation, quality, and lean management:

https://www.simplilearn.com/pgp-lean-six-sigma-certification-training-course?utm_source=google&utm_medium=cpc&utm_term=&utm_content=11174393172-108220153863-506962883161&utm_device=c&utm_campaign=Display-MQL-DigitalOperationsCluster-PG-QM-CLSS-UMass-YTVideoInstreamCustomIntent-US-Main-AllDevice-adgroup-QM-Desktop-CI&gclid=Cj0KCQiA3rKQBhCNARIsACUEW_ZGLHcUP2htLdQo46zP6Eo2-vX0MQYvc-o6GQP55638Up4tex85RBEaArn9EALw_wcB


And notice in the above webpage of the professor that he is giving a Post Graduate Program in Lean Six Sigma and on agile methodology, and i think that this Post Graduate Program is easy for me since i am really smart and i can easily understand Lean Six Sigma or Six Sigma and i can easily understand agile methodology. And notice that in my below thoughts i am also explaining much more smartly what agile methodology is, and i think that the more difficult part of Six Sigma or Lean Six Sigma is to understand the central limit theorem, to understand what SPC (Statistical process control) is, and to understand how to use the control charts so that to control the variability of the defects, and notice that i am talking about it in my below thoughts, but i think that the rest of Lean Six Sigma and Six Sigma is easy for me.


More of my philosophy about IQ tests and more of my thoughts..


I think i am highly smart, and I have passed two certified IQ tests and i have scored above 115 IQ, but i have just passed more and more IQ tests, and i have just noticed that the manner in which we test with IQ tests is not correct, since in an IQ test you can be more specialized and above average in one subtest of intelligence than in another subtest inside the IQ test, since IQ tests test for many kinds of intelligence such as spatial, speed, calculations, logical intelligence etc. So i think that you can be really above average in logical intelligence, as i am really above average in logical intelligence, but at the same time you can be below average in calculations and/or spatial intelligence. So since an IQ test doesn't test for this kind of specialization of intelligence, i think it is not good, since testing for this kind of specialization of intelligence is really important so that to be efficient by knowing the strong advantages of this or that person in every type of intelligence. And about the importance of specialization, read carefully my following thought about it:


More of my philosophy about specialization and about efficiency and productivity..

The previous CEO Larry Culp of General Electric was the architect of a strategy that represented a new turning point in world corporate strategies. Larry Culp's strategy was to divide the company according to its activities. Something like: we are better off alone, separately and
focused on each one's own activity, than together in a large
conglomerate. And it is a move from integration to specialization.
You see, it is thought that a company always gains economies of scale
as it grows, but this is not necessarily the case, since as the company
gains in size - especially if it engages in many activities - it
also generates its own bureaucracy, with all that entails in terms
of cost and efficiency. And not only that, it is also often the case
that by bringing together very different activities, strategic focus is lost and decision-making is diluted, so that in the end no one ends up
taking responsibility. It doesn't always happen, but these reasons are
basically what is driving this increasing specialization. So i invite you to look at the following video so that to understand more about it:

The decline of industrial icon of the US - VisualPolitik EN

https://www.youtube.com/watch?v=-hqwYxFCY-k


And here is my previous thoughts about specialization and productivity so that to understand much more:

More about the Japanese Ikigai and about productivity and more of my thoughts..

Read the following interesting article about Japanese Ikigai:

The More People With Purpose, the Better the World Will Be

https://singularityhub.com/2018/06/15/the-more-people-with-purpose-the-better-the-world-will-be/

I think i am highly smart, so i say that the Japanese Ikigai is like a Japanese philosophy that is like the right combination or "balance" of passion, vocation, and mission, and Ikigai and MTP, as concepts, urge us to align our passions with a mission to better the world. But i think that Japanese Ikigai is also smart since it gets the "passion" from the "mission", since the mission is also the engine, so you have to align the passion with the mission of the country or the global world so that to be efficient. And Japanese Ikigai is also smart since, so that to higher productivity and be efficient, you have to "specialize" in doing a job, but so that to higher more productivity and be more efficient you can also specialize in what you do "better", and it is what Japanese Ikigai is doing, since i think that in Japanese Ikigai the passion permits to make you specialized in a job in what you do better, and here is what i have just smartly said about productivity:

I think i am highly smart, and i have passed two certified IQ tests and i have scored above 115 IQ, and i will now talk about another important idea of Adam Smith, the father of economic Liberalism, and it is about "specialization" in an economic system. I say that in an economic system we have to be specialized in doing a job so that to be efficient and productive, but not only that, we have to specialize in doing a job in what we do better so that to be even more efficient and productive, and we have to minimize at best the idle time or the wasting of time doing a job, since i can also say that this average idle time or wasted time of the workers working in parallel can be converted to a contention like in parallel programming, so you have to minimize it at best, and you have to minimize at best the coherency like in parallel programming so that to scale much better, and of course all this can create an economy of scale. And also i invite you to read my following smart and interesting thoughts about scalability of productivity:

I will talk about following thoughts from the following PhD computer scientist:

https://lemire.me/blog/about-me/

Read more carefully here his thoughts about productivity:

https://lemire.me/blog/2012/10/15/you-cannot-scale-creativity/

And i think he is making a mistake in his above webpage about productivity:

Since we have that Productivity = Output/Input

But better human training and/or better tools and/or better human smartness and/or better human capacity can make the parallel productivity part much bigger than the serial productivity part, so it can scale much more (it is like Gustafson's Law).

And it looks like the following:

About parallelism and about Gustafson’s Law..

Gustafson’s Law:

• If you increase the amount of work done by each parallel
task then the serial component will not dominate
• Increase the problem size to maintain scaling
• Can do this by adding extra complexity or increasing the overall
problem size

Scaling is important, as the more a code scales the larger a machine it
can take advantage of:

• can consider weak and strong scaling
• in practice, overheads limit the scalability of real parallel programs
• Amdahl’s law models these in terms of serial and parallel fractions
• larger problems generally scale better: Gustafson’s law


Load balance is also a crucial factor.
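And Gustafson's law above can be sketched as a small calculation (the serial fractions and core counts here are hypothetical):

```python
# Gustafson's law: scaled speedup S(N) = s + p * N, where s is the
# serial fraction of the scaled workload, p = 1 - s is the parallel
# fraction, and N is the number of cores. Because the problem size
# grows with N, the serial part does not dominate as in Amdahl's law.
def gustafson_speedup(serial_fraction, cores):
    return serial_fraction + (1 - serial_fraction) * cores

# With a 10% serial fraction, growing the problem keeps scaling useful:
print(round(gustafson_speedup(0.1, 16), 1))   # → 14.5
print(round(gustafson_speedup(0.1, 64), 1))   # → 57.7
```

Notice that under Amdahl's law the same 10% serial fraction would cap the speedup at 10 no matter how many cores are added, which is exactly the difference the bullets above describe.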


And read my following thoughts about Evolutionary Design methodology and that is so important so that to understand:

And I invite you to look at step 4 of my below thoughts of software Evolutionary Design methodology with agile, here it is:

4- When in agile a team breaks a project into phases, it’s called
incremental development. An incremental process is one in which
software is built and delivered in pieces. Each piece, or increment,
represents a complete subset of functionality. The increment may be
either small or large, perhaps ranging from just a system’s login
screen on the small end to a highly flexible set of data management
screens. Each increment is fully coded Sprints, Planning, and
Retrospectives.

And you will notice that it has to be done by "prioritizing" the pieces of the software to be delivered to the customers, and here again in agile you are noticing that we are also delivering prototypes of the software, since we often associate prototypes with nearly completed or just-before-launch versions of products. However, designers create prototypes at all phases of the design process at various resolutions. In engineering, students are taught, and practitioners trained, to think deeply before setting out to build. However, as the product or system becomes increasingly complex, it becomes increasingly difficult to consider all factors while designing. Facing this reality, designers are no longer just "thinking to build" but also "building to think". By getting hands-on and trying to create prototypes, unforeseen issues are highlighted early, saving costs related to late-stage design changes. This rapid iterative cycle of thinking and building is what allows designers to learn rapidly from doing. Creating interfaces often benefits from the "build to think" approach. For example, in trying to lay out the automotive cockpit, one can simply list all the features, buttons, and knobs that must be incorporated. However, only by prototyping the cabin does one really start to think about how the layout should appear to the driver in order to avoid confusion while maximizing comfort. This then allows the designer to iterate on their initial concept to develop something that is more intuitive and refined. Also, prototypes and their demonstrations are designed to get potential customers interested and excited.

More of my philosophy about the Evolutionary Design methodology and more..

Here are some important steps of software Evolutionary Design methodology:

1- Taking a little extra time during the project to write solid code and
fix problems today, they create a codebase that’s easy to maintain
tomorrow.

2- And the most destructive thing you can do to your project is to build
new code, and then build more code that depends on it, and then still
more code that depends on that, leading to that painfully familiar
domino effect of cascading changes...and eventually leaving you with
an unmaintainable mess of spaghetti code. So when teams write code,
they can keep their software designs simple by creating software
designs based on small, self-contained units (like classes, modules,
services, etc.) that do only one thing; this helps avoid the domino
effect.

3- Instead of creating one big design at the beginning of the project
that covers all of the requirements, agile architects use incremental
design, which involves techniques that allow them to design a system
that is not just complete, but also easy for the team to modify as
the project changes.

4- When in agile a team breaks a project into phases, it’s called
incremental development. An incremental process is one in which
software is built and delivered in pieces. Each piece, or increment,
represents a complete subset of functionality. The increment may be
either small or large, perhaps ranging from just a system’s login
screen on the small end to a highly flexible set of data management
screens. Each increment is fully coded Sprints, Planning, and
Retrospectives.

5- And an iterative process in agile is one that makes progress through
successive refinement. A development team takes a first cut
at a system, knowing it is incomplete or weak in some (perhaps many)
areas. They then iteratively refine those areas until the product is
satisfactory. With each iteration the software is improved through
the addition of greater detail.

More of philosophy about Democracy and the Evolutionary Design methodology..

I will make a logical analogy between software projects and Democracy,
first i will say that because of the today big complexity of software
projects, so the "requirements" of those complex software projects are
not clear and a lot could change in them, so this is
why we are using an Evolutionary Design methodology with different tools
such as Unit Testing, Test Driven Development, Design Patterns,
Continuous Integration, Domain Driven Design, but we have to notice
carefully that an important thing in Evolutionary Design methodology is
that when those complex software projects grow, we have first to
normalize there growth by ensuring that the complex software projects
grow "nicely" and "balanced" by using standards, and second we have to
optimize growth of the complex software projects by balancing between
the criteria of the easy to change the complex software projects and the
performance of the complex software projects, and third you have to
maximize the growth of the complex software projects by making the most
out of each optimization, and i think that by logical analogy we can
notice that in Democracy we have also to normalize the growth by not
allowing "extremism" or extremist ideologies that hurt Democracy, and we
have also to optimize Democracy by for example well balancing between
"performance" of the society and in the Democracy and the "reliability"
of helping others like the weakest members of the society among the
people that of course respect the laws.


More of my philosophy about the the importance of randomness in
the genetic algorithm and in the evolutionary algorithms and more
of my thoughts..

More of my philosophy about the genetic algorithm and about artificial intelligence and more of my thoughts..

I think i am highly smart, so i will ask the following philosophical question about the genetic algorithm:

Is the genetic algorithm a brute-force search and if it is
not, how is it different than the brute-force search ?

So i have just quickly taken a look at some example of a minimization problem with a genetic algorithm, and i think that the genetic algorithm is not a brute-force search. When in a minimization problem with a genetic algorithm you do a crossover, also called recombination, which is a genetic operator used to combine the genetic information of two parents to generate new offspring, the genetic algorithm has this tendency to also explore locally, and we call it exploitation, and when the genetic algorithm does genetic mutations with a level of probability, the genetic algorithm has this tendency to explore globally, and we call it exploration. So i think a good genetic algorithm is the one that efficiently balances exploration and exploitation so that to avoid premature convergence, and notice that when you explore locally and globally you can do it with a bigger population that makes the search faster, so it is why i think the genetic algorithm has this kind of patterns that makes it a much better search than brute-force search. And so that to know more about this kind of artificial intelligence, i invite you to read my following thoughts in the following web link about evolutionary algorithms and artificial intelligence so that to understand more:
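And this balance of exploitation and exploration can be sketched with a minimal genetic algorithm (all the parameters and names here are hypothetical choices for illustration) that minimizes f(x) = x*x:

```python
import random

# Minimal genetic algorithm minimizing f(x) = x*x: crossover exploits
# (it blends two parents, searching locally around them), mutation
# explores (random jumps), and mutation_rate balances the two.
def evolve(pop_size=50, generations=100, mutation_rate=0.1):
    random.seed(1)
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda x: x * x)           # fitness: lower is better
        parents = pop[: pop_size // 2]          # elitist selection keeps the best
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2                 # crossover: exploitation
            if random.random() < mutation_rate:
                child += random.uniform(-1, 1)  # mutation: exploration
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda x: x * x)

best = evolve()
print(best)   # converges near the optimum at x = 0
```

Raising mutation_rate shifts the balance toward exploration, lowering it toward exploitation, which is exactly the trade-off against premature convergence described above.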

https://groups.google.com/g/alt.culture.morocco/c/joLVchvaCf0

More of my philosophy about the other conditions of the genetic algorithm and about artificial intelligence and more of my thoughts..
I think i am highly smart, and i think that the genetic algorithm is interesting too, but i have to speak about another most important thing about the genetic algorithm, so i will ask a philosophical question about it:

Since as i just said previously, read it below, that a good genetic algorithm has to efficiently balance between global(exploration) and local(exploitation) search , but how can you be sure that you have found a global optimum ?

I think i am smart, and i will say that it also depends on the kind of problem. So if for example we have a minimization problem, you can rerun the genetic algorithm a number of times so that to select the best minimum among all the results, and you can also give more time to the exploration so that to find a better result. Also you have to know that the genetic algorithm can be more elitist in the crossover steps, but i think that this kind of elitism can have the tendency to not efficiently raise the average fitness of the members of the population, so it depends on which problem you want to use the genetic algorithm for. Also i think that the genetic algorithm is a model that explains where humans come from, since i also think that the genetic mutations of humans, that happen with a probability, have not only come from inside the body, from the chromosomes and genes, but were also the result of solar storms that, as NASA has said, may have been key to life on Earth; read here so that to notice it:

https://www.nasa.gov/feature/goddard/2016/nasa-solar-storms-may-have-been-key-to-life-on-earth

I think i am highly smart, and i will invite you to read my following
smart thoughts about evolutionary algorithms and artificial intelligence so that you notice how i am talking about the so important thing that we call "randomness":

https://groups.google.com/g/alt.culture.morocco/c/joLVchvaCf0


So i think i am highly smart, and notice that i am saying in the above web link the following about evolutionary algorithms:

"I think that Modern trends in solving tough optimization problems tend
to use evolutionary algorithms and nature-inspired metaheuristic
algorithms, especially those based on swarm intelligence (SI), two major
characteristics of modern metaheuristic methods are nature-inspired, and
a balance between randomness and regularity."

So i think that in the genetic algorithm, there is a part that is hard coded, like selecting the best genes, and i think that it is what we call regularity, since it is hard coded like that, but there is a so important thing in the genetic algorithm that we call randomness, and i think that it is the genetic mutations that happen with a probability and that give a kind of diversity. So i think that these genetic mutations are really important, since i can for example say that if the best genes are the ones that use "reason", then reason too can make the people that have the tendency to use reason do a thing that is against their survival, like going to war when we feel that there is too much risk, and this going to war can make the members or people that use reason so that to attack the other enemy become extinct when they lose the war, and it is the basis of randomness in a genetic algorithm, since even when there is a war between for example two ant colonies, there are some members that do not make war and that can survive if the others are extinct by making war, and i say it also comes from the randomness of the genetics.

More of my philosophy about student performance and about artificial intelligence and more of my thoughts..


I have just read the following interesting article from McKinsey, and
i invite you to read it carefully:

Drivers of student performance: Asia insights

https://www.mckinsey.com/industries/education/our-insights/drivers-of-student-performance-asia-insights

And i think i am smart, and i think that the following factors in the above article that influence student performance are not so difficult to implement:

1- Students who receive a blend of inquiry-based and teacher-directed
instruction have the best outcomes

2- School-based technology yields the best results when placed in the
hands of teachers

3- Early childhood education has a positive impact on student scores,
but the quality and type of care is important

But i think that the factor that is tricky to implement (since it needs good smartness) is good motivation calibration, which permits students to score 8 to 14 percent higher on the science test than poorly calibrated ones, and high self-identified motivation, which permits them to score 6 to 8 percent higher.




Thank you,
Amine Moulay Ramdane.




