
More philosophy of what it is to be a genius?

amin...@gmail.com

Feb 20, 2020, 5:43:24 PM
Hello,


More philosophy of what it is to be a genius?

You have just noticed that I said in one of my previous posts that I am "like" a genius. You have to understand the "like": it does not mean that I am a genius, but perhaps I am one! I will try to make you understand more of what it is to be a genius. I think that you have to understand the following rule: you have to know how to minimize complexity as much as possible by maximizing efficiency as much as possible. This is the rule that I am following, and I think this rule is the most important goal that a genius wants to attain; when you understand it, you will become much smarter.

A genius first wants to make great things out of simple things, or out of a few simple things. So this brings us to the question: how do we make great things from simple things, or from a few simple things? And this looks like minimizing complexity as much as possible by maximizing efficiency as much as possible. So to understand better what this rule looks like in practice, read my following thoughts:

About smartness and how to measure it..

I think that in IQ tests smartness is measured by a level of difficulty,
so you have to solve problems of a certain level of difficulty. So it is relative to
a certain level of difficulty of the IQ tests, such as those of Mensa. But I think that if we are smart and we lower the level of difficulty, for example by easing the understanding of knowledge, science and techniques, then we are going to become smarter and more capable of smartness. This is why I also say the following:

Today I will do political philosophy about pedagogy..

Can we ask the question: what must an efficient pedagogy be today?
I think that we have to notice that exponential progress
and the law of accelerating returns are influencing pedagogy,
because today we have to be more efficient at learning.
I think that the level of sophistication of today
is much higher than in the past, and as a consequence
we have to be much more sophisticated than in the past
to be able to learn efficiently. I think that this is
the "first" requirement of today's pedagogy.

More than that, we have to define the other main parts of what it is to learn efficiently; I mean, what are the main parts of "efficient learning" that have great importance? Take for example choosing the right tools for
learning: the first question is how to choose the right tools, and not only that, but for a tool to really be the right tool, the learning of that tool must be made "easier". Now you are noticing
that the parts of efficient learning are becoming clearer,
and this is why I said before in my political philosophy
the following, which permits you to understand more of what I am saying:

What is happening in my brain?

I will speak more about myself so that you understand my way of doing things, since
it is also like my philosophy. To be smarter you have to be capable
of reducing "complexity" efficiently, so as to be efficient and
more successful. But how can you do it? You have to
know the steps that guide you in the right direction.
First I will speak about my way so that you understand me better.

The first step is that you have to be able to "prioritize" efficiently, because
to be able to be successful you have to prioritize. Look for example
at me: I have decided to "study" more and to study more "efficiently" so as to be more successful. But this is not "sufficient", because to be efficient at reducing complexity you have to be efficiently selective about your knowledge, and this "selectivity" has in its turn to adhere to the process of being efficient at reducing "complexity". So you have to be able to select "efficiently" the "efficient" knowledge that is "easier" to learn, so as to reduce complexity; thus you have to be able to ask "questions" to the right persons so as to be efficient at selecting your "knowledge". The next step is that you have to be "tenacious" at studying efficiently, and you have to study more and more and ask questions to your professors. The next step, after you have been able to learn more and more, is to be efficient at the "reusability" of your efficient knowledge, and this is a very important step, so don't neglect efficient "reusability" of your knowledge. These are the steps that I have followed, and I have also used my "smartness" to be more efficient.

Also, I said the following about smartness:

I think there is diversity in the set that we call people, but
we have to understand that to be able to classify them by smartness,
we have to understand that there is not only "IQ" as we know it
that can measure smartness, because I say that there is an important missing part. This is why I say that the way of measuring by IQ is still fuzzy: I say that statistically we can see that an important missing part is that a human can be genetically more articulate in his thinking, which is like more constancy of calculation and/or more constancy of logic and more constancy of rationality. It is like being a computer that calculates more, that is more logical and that is more rational, and this is how I am in real life. This is also the missing part of smartness that is not measured by IQ, because it is more dynamic.

And here is also what I said previously:

I am like a genius, because I have invented many scalable algorithms and I am still inventing more of them. You have to see me to believe it; this is why I am talking here. I have shown you some of the scalable algorithms that I have invented, but that is not all, because I have invented many other scalable algorithms that I have not shown you here, and here is one more new invention that I have just
invented right now:

As you may have noticed, I have just implemented my EasyList here:

https://sites.google.com/site/scalable68/easylist-for-delphi-and-freepascal

But I have just enhanced its algorithm so that the Add() method and the search methods are scalable. That is not all: I will use for
that my just-new invention, my generally scalable counting networks. Its parallel sort algorithm will also become much more scalable, because I will use for that my other invention, my fully scalable threadpool, and it will use a fully scalable parallel merging algorithm. My just-new invention of generally scalable counting networks is described further below.
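
Since my actual EasyList algorithm and my counting-networks enhancement are not shown in this post, here is only a minimal FreePascal sketch of one generic way an Add() method can be made more scalable: the list is striped into several sublists, each guarded by its own lock, so that concurrent Add() calls rarely contend on the same lock. This is an assumed design for illustration only; it is not my EasyList implementation.

program StripedListSketch;
{$mode objfpc}{$H+}

uses
  {$ifdef unix}cthreads,{$endif} Classes, SyncObjs;

const
  StripeCount = 8;  // more stripes means less contention on Add()

type
  // A list split into independent stripes, each guarded by its own lock,
  // so that threads adding concurrently rarely fight over the same lock.
  TStripedList = class
  private
    FLocks: array[0..StripeCount - 1] of TCriticalSection;
    FStripes: array[0..StripeCount - 1] of TList;
    function StripeIndex: Integer;
  public
    constructor Create;
    destructor Destroy; override;
    procedure Add(AItem: Pointer);
    function Contains(AItem: Pointer): Boolean;
  end;

constructor TStripedList.Create;
var I: Integer;
begin
  inherited Create;
  for I := 0 to StripeCount - 1 do
  begin
    FLocks[I] := TCriticalSection.Create;
    FStripes[I] := TList.Create;
  end;
end;

destructor TStripedList.Destroy;
var I: Integer;
begin
  for I := 0 to StripeCount - 1 do
  begin
    FLocks[I].Free;
    FStripes[I].Free;
  end;
  inherited Destroy;
end;

function TStripedList.StripeIndex: Integer;
begin
  // Spread threads over the stripes using the calling thread's id.
  Result := Integer(PtrUInt(GetCurrentThreadId) mod StripeCount);
end;

procedure TStripedList.Add(AItem: Pointer);
var Idx: Integer;
begin
  // Only one of the StripeCount locks is taken, so concurrent adders
  // on different stripes proceed in parallel.
  Idx := StripeIndex;
  FLocks[Idx].Acquire;
  try
    FStripes[Idx].Add(AItem);
  finally
    FLocks[Idx].Release;
  end;
end;

function TStripedList.Contains(AItem: Pointer): Boolean;
var I: Integer;
begin
  // A plain search that visits every stripe; the stripes could also be
  // searched by separate threads to parallelize the lookup.
  Result := False;
  for I := 0 to StripeCount - 1 do
  begin
    FLocks[I].Acquire;
    try
      if FStripes[I].IndexOf(AItem) >= 0 then
        Exit(True);
    finally
      FLocks[I].Release;
    end;
  end;
end;

var
  L: TStripedList;
begin
  L := TStripedList.Create;
  try
    L.Add(Pointer(123));
    WriteLn('Contains 123: ', L.Contains(Pointer(123)));
  finally
    L.Free;
  end;
end.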

Here is my new invention of a scalable algorithm:

I have just read the following PhD paper about the invention that we call counting networks, which are better than software combining trees:

Counting Networks

http://people.csail.mit.edu/shanir/publications/AHS.pdf

And I have read the following PhD paper:

http://people.csail.mit.edu/shanir/publications/HLS.pdf

So as you can notice, they say in the conclusion that:

"Software combining trees and counting networks which are the only techniques we observed to be truly scalable"

But I have just found that this counting networks algorithm is not generally scalable, and I have the logical proof of it; this is why I have just come up with a new invention that enhances the counting networks algorithm so that it is generally scalable. And I think I will sell my new algorithm of generally scalable counting networks to Microsoft or Google or Embarcadero or such
software companies.

So you have to be careful with the existing counting networks algorithm, since it is not generally scalable.
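
My generally scalable enhancement is not published in this post, so here is only a minimal FreePascal sketch of the classical counting-network idea from the papers above, as an illustration and not my invention: a balancer toggles incoming tokens between two output wires, and each wire has its own local counter, so the values 0, 1, 2, 3, ... are handed out without one counter holding all of the traffic.

program CountingNetworkSketch;
{$mode objfpc}{$H+}

type
  // The smallest possible counting network: one balancer feeding two
  // output wires.  A balancer sends arriving tokens alternately to
  // wire 0 and wire 1; wire i then returns i, i + 2, i + 4, ...
  TTinyCountingNetwork = class
  private
    FToggle: LongInt;                   // balancer state
    FWireCount: array[0..1] of LongInt; // per-wire local counters
  public
    function GetAndIncrement: LongInt;
  end;

function TTinyCountingNetwork.GetAndIncrement: LongInt;
var
  Wire, Local: LongInt;
begin
  // Traverse the balancer: tokens alternate between the two wires.
  Wire := (InterLockedIncrement(FToggle) + 1) and 1;
  // Take a ticket on that wire's local counter (its old value).
  Local := InterLockedIncrement(FWireCount[Wire]) - 1;
  // Wire 0 hands out 0, 2, 4, ...; wire 1 hands out 1, 3, 5, ...
  Result := Wire + 2 * Local;
end;

var
  Net: TTinyCountingNetwork;
  I: Integer;
begin
  Net := TTinyCountingNetwork.Create;
  try
    for I := 1 to 6 do
      Write(Net.GetAndIncrement, ' ');   // prints 0 1 2 3 4 5
    WriteLn;
  finally
    Net.Free;
  end;
end.

In a wider network the tokens are spread over many balancers, so no single memory location sees all of the traffic; with width 2 the toggle still does, which is why this sketch only shows the mechanics of the published idea.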

My other new invention is my scalable reference counting, and here it is:

https://sites.google.com/site/scalable68/scalable-reference-counting-with-efficient-support-for-weak-references
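
The scalable algorithm behind that link is not reproduced here, so here is just a minimal, non-scalable FreePascal sketch of the general strong/weak counting idea, as an assumed illustration: strong references keep the object alive, weak references only keep a small control block alive and let you test whether the object still exists. A scalable version would, for example, distribute the strong count over per-thread counters, which this sketch does not do.

program WeakRefSketch;
{$mode objfpc}{$H+}

type
  PControlBlock = ^TControlBlock;
  TControlBlock = record
    Strong: LongInt;  // strong references; the payload dies when it hits 0
    Weak: LongInt;    // weak references, plus 1 as long as Strong > 0
    Obj: TObject;     // the managed payload
  end;

// Create a control block holding one strong reference to AObj.
function NewStrong(AObj: TObject): PControlBlock;
begin
  New(Result);
  Result^.Strong := 1;
  Result^.Weak := 1;     // the strong side holds one implicit weak reference
  Result^.Obj := AObj;
end;

// Register one more weak reference.
procedure AcquireWeak(CB: PControlBlock);
begin
  InterLockedIncrement(CB^.Weak);
end;

// Try to turn a weak reference into a strong one; fails if the payload died.
function TryUpgrade(CB: PControlBlock): Boolean;
var
  Old: LongInt;
begin
  repeat
    Old := CB^.Strong;
    if Old = 0 then
      Exit(False);
  until InterlockedCompareExchange(CB^.Strong, Old + 1, Old) = Old;
  Result := True;
end;

// Drop a weak reference; the control block goes away with the last one.
procedure ReleaseWeak(CB: PControlBlock);
begin
  if InterLockedDecrement(CB^.Weak) = 0 then
    Dispose(CB);
end;

// Drop a strong reference; the payload dies with the last strong reference.
procedure ReleaseStrong(CB: PControlBlock);
begin
  if InterLockedDecrement(CB^.Strong) = 0 then
  begin
    CB^.Obj.Free;
    CB^.Obj := nil;
    ReleaseWeak(CB);   // give back the strong side's implicit weak reference
  end;
end;

var
  CB: PControlBlock;
begin
  CB := NewStrong(TObject.Create);
  AcquireWeak(CB);                        // hand out a weak reference
  if TryUpgrade(CB) then                  // the payload is still alive
  begin
    WriteLn('upgrade ok, object still alive');
    ReleaseStrong(CB);
  end;
  ReleaseStrong(CB);                      // last strong reference: payload freed
  WriteLn('upgrade after death: ', TryUpgrade(CB));  // prints FALSE
  ReleaseWeak(CB);                        // last weak reference: block disposed
end.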

And my other new invention is my scalable Fast Mutex that is really powerful, and here it is:

About fair and unfair locking..

I have just read the following from a lead engineer at Amazon:

Highly contended and fair locking in Java

https://brooker.co.za/blog/2012/09/10/locking.html

So as you can notice, you can use unfair locking, which can suffer from starvation, or fair locking, which is slower than unfair locking.

I think that Microsoft synchronization objects like the Windows critical section use unfair locking, so they can still have starvation.
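
To make the trade-off concrete, here is a minimal FreePascal sketch of the two textbook extremes, as an illustration only (neither of them is my Fast Mutex): a ticket lock serves threads strictly in FIFO order, so it is fair and starvation-free but pays for the ordered handoff, while a simple test-and-set lock lets any thread barge in, so it is unfair and can starve waiters but has a cheap fast path.

program FairVsUnfairSketch;
{$mode objfpc}{$H+}

type
  // Fair: every thread takes a ticket and is served strictly in order,
  // so no thread can starve, but each release hands over to one waiter.
  TTicketLock = class
  private
    FNext: LongInt;    // next ticket to hand out
    FServing: LongInt; // ticket currently allowed to enter
  public
    procedure Acquire;
    procedure Release;
  end;

  // Unfair: whoever wins the atomic exchange gets the lock, so a late
  // arrival can barge ahead of older waiters, which can starve.
  TTASLock = class
  private
    FHeld: LongInt;    // 0 = free, 1 = held
  public
    procedure Acquire;
    procedure Release;
  end;

procedure TTicketLock.Acquire;
var
  MyTicket: LongInt;
begin
  MyTicket := InterLockedIncrement(FNext) - 1;  // take a ticket
  while FServing <> MyTicket do
    ThreadSwitch;                               // wait for our turn
end;

procedure TTicketLock.Release;
begin
  InterLockedIncrement(FServing);               // serve the next ticket
end;

procedure TTASLock.Acquire;
begin
  // Spin until the exchange sees the lock free; arrival order is ignored.
  while InterLockedExchange(FHeld, 1) <> 0 do
    ThreadSwitch;
end;

procedure TTASLock.Release;
begin
  InterLockedExchange(FHeld, 0);
end;

var
  Fair: TTicketLock;
  Unfair: TTASLock;
begin
  Fair := TTicketLock.Create;
  Unfair := TTASLock.Create;
  try
    Fair.Acquire;   WriteLn('ticket lock held');  Fair.Release;
    Unfair.Acquire; WriteLn('TAS lock held');     Unfair.Release;
  finally
    Fair.Free;
    Unfair.Free;
  end;
end.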

But I think that those classical approaches are not the right way to do it, because I am an inventor and I have invented a scalable Fast Mutex that is much more powerful: with my Fast Mutex you are capable of tuning the "fairness" of the lock, and my Fast Mutex is capable of more than that; read about it in my following thoughts:

More about research and software development..

I have just looked at the following new video:

Why is coding so hard...

https://www.youtube.com/watch?v=TAAXwrgd1U8


I understand this video, but I have to explain my work:

I am not like the techlead in the video above, because I am also an "inventor" who has invented many scalable algorithms and their implementations, and I am also inventing effective abstractions. I will give you an example:

Read the following from the senior research scientist called Dave Dice:

Preemption tolerant MCS locks

https://blogs.oracle.com/dave/preemption-tolerant-mcs-locks

As you can notice, he is trying to invent a new lock that is preemption tolerant, but his lock lacks some important characteristics. This is why I have just invented a new Fast Mutex that is adaptive and much better; I think mine is the "best", and I think you will not find it anywhere. My new Fast Mutex has the following characteristics:

1- Starvation-free
2- Tunable fairness
3- It keeps its cache-coherence traffic very low (see the sketch after this list for what that generally looks like)
4- Very good fast-path performance
5- And it has good preemption tolerance.
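
My Fast Mutex algorithm itself is not published in this post, so the following is only a small FreePascal sketch of what properties 3 and 4 can look like in a generic lock, as an assumed illustration and not my algorithm: a test-and-test-and-set lock with exponential backoff, where waiters wait on plain reads so the owner's cache line is not invalidated, the uncontended path is a single atomic exchange, and backing off also gives a preempted owner a chance to run.

program BackoffLockSketch;
{$mode objfpc}{$H+}

uses
  SysUtils;

type
  // Test-and-test-and-set with exponential backoff.
  TBackoffLock = class
  private
    FHeld: LongInt;            // 0 = free, 1 = held
  public
    procedure Acquire;
    procedure Release;
  end;

procedure TBackoffLock.Acquire;
var
  Delay: Integer;
begin
  // Fast path: a single atomic exchange when the lock is uncontended.
  if InterLockedExchange(FHeld, 1) = 0 then
    Exit;
  Delay := 1;
  repeat
    // Wait with plain reads (reads do not invalidate the owner's cache
    // line) and with sleeps, which also let a preempted owner run.
    while FHeld <> 0 do
      Sleep(Delay);
    // The lock looked free: try the expensive atomic exchange once.
    if InterLockedExchange(FHeld, 1) = 0 then
      Exit;
    // Lost the race: back off exponentially before trying again.
    if Delay < 32 then
      Delay := Delay * 2;
  until False;
end;

procedure TBackoffLock.Release;
begin
  InterLockedExchange(FHeld, 0);
end;

var
  Lock: TBackoffLock;
begin
  Lock := TBackoffLock.Create;
  try
    Lock.Acquire;
    WriteLn('critical section');
    Lock.Release;
  finally
    Lock.Free;
  end;
end.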

This is how I am an "inventor". I have also invented other scalable algorithms, such as a scalable reference counting with efficient support for weak references, a fully scalable threadpool and a fully scalable FIFO queue, and I have also invented other scalable algorithms and their implementations, and I think I will sell some of them to Microsoft or to Google or Embarcadero or such software companies.


Thank you,
Amine Moulay Ramdane.



