Hacker News | new | past | comments | ask | show | jobs | submit | rottc0dd's comments

From my previous comment on HN:

As a Java guy who thinks Python is weird, I don't think this sucks.

But I also agree that it can serve as a terrible intro to programming if you start right away without understanding the basics of abstraction. Often, when tools are designed for a specific purpose, around a dominant paradigm, or as a reaction to an existing set of tooling, the result can be understandable yet extreme abstractions.

Java is designed with OOP in mind, and it kind of makes sense to have the user think in terms of Lego blocks of interfaces. Every method or class needs a clear understanding of who its users are.

public - the handle is available to all users

protected - the handle is available to the current class, its subclasses, and the rest of its package

default (no modifier) - the handle is exposed only within the current package

private - the handle is restricted to the current class alone

So Java programming begins with the interface exposed to the user or to other programmers. Is it weird and extreme? Yes. But at least it is consistent.
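A minimal Java sketch of the four levels (the class and field names are made up for illustration):

```java
// AccessDemo.java - illustrates Java's four member access levels.
public class AccessDemo {
    public int visibleToEveryone = 1;      // any class, any package
    protected int visibleToSubclasses = 2; // same package + subclasses
    int visibleToPackage = 3;              // "default": same package only
    private int visibleToThisClass = 4;    // this class only

    public int sum() {
        // All four are accessible from inside the declaring class.
        return visibleToEveryone + visibleToSubclasses
             + visibleToPackage + visibleToThisClass;
    }

    public static void main(String[] args) {
        System.out.println(new AccessDemo().sum()); // prints 10
    }
}
```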


This heavyweight syntax has nothing to do with OOP. No common definition of OOP (before Java came into existence) said that functions cannot exist outside of a class.

Java chose to adopt object-oriented purity in a very weird way. Sun initially rejected the idea of standalone functions (static imports were introduced in Java 5, closures in Java 8, and compact source files with an instance main method are only arriving now). Even with static imports and compact source files, there is still a class holding these methods behind the scenes (for practical and compatibility reasons, more than to achieve any ideological or theoretical goal at this point). It seems Sun was trying to keep the purity of "everything is an object", but at the same time they introduced primitive values, which are clearly not objects. That was a pretty bad decision in my opinion. There is no practical benefit to requiring every function to be defined inside a class[1]. On the other hand, being able to have methods defined on an integer is a pretty nice feature.
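For example, a static import (available since Java 5) lets a call site read like a free function, even though the method still belongs to a class (the demo class and method names here are made up):

```java
import static java.lang.Math.max;

// Since Java 5, a static import hides the class name at the call site,
// but the method still lives inside a class (here, java.lang.Math).
public class StaticImportDemo {
    static int larger(int a, int b) {
        return max(a, b); // no "Math." prefix needed
    }

    public static void main(String[] args) {
        System.out.println(larger(3, 7)); // prints 7
    }
}
```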

If we look at Smalltalk, which is generally considered to be an example of a "pure" OOP language, we see a striking difference. This is Hello World in Smalltalk:

  'Hello, world!' printNl
Smalltalk allows you to freely define functions outside of classes and even write code that is not part of a class (directly in the REPL).

[1] https://steve-yegge.blogspot.com/2006/03/execution-in-kingdo...


I am not sure about the historical definition of OOP, but even Alan Kay seems to agree that the modern definition of OOP is not what he intended[1]. But for better or worse, we are stuck with these principles.

We even have design patterns like Command to work around the lack of first-class functions in a "pure" OOPy way.
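A minimal sketch of the Command pattern (all names here are illustrative): an action is wrapped in an object so it can be passed around where a first-class function is unavailable, as in pre-Java-8 code.

```java
// Command pattern: reify an action as an object, because (before Java 8)
// methods could not be passed around as values.
interface Command {
    void execute();
}

class Light {
    boolean on = false;
}

class TurnOnCommand implements Command {
    private final Light light;
    TurnOnCommand(Light light) { this.light = light; }
    public void execute() { light.on = true; }
}

public class CommandDemo {
    // The "invoker" only knows the Command interface, not the action.
    static void run(Command c) { c.execute(); }

    public static void main(String[] args) {
        Light light = new Light();
        run(new TurnOnCommand(light));
        System.out.println(light.on); // prints true
    }
}
```

Since Java 8, a lambda implementing a functional interface expresses the same idea with far less ceremony.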

And for enterprise software development, I like it that way. A language can make up whatever definition it wants and stick to it. I think it is better for a language's ecosystem and culture to have one dominant paradigm than to become a kitchen sink of paradigms.

Edit: added a link

[1] softwareengineering.stackexchange.com/questions/264697/alan-kay-the-big-idea-is-messaging


> Smalltalk allows you to freely define functions outside of classes

Please show an example.


I agree; this is the whole point of "Hello world": to show the boilerplate required to start a program capable of outputting text, as well as the actual code that outputs that text. It's also the chance to get the build tools set up. People forget that some of the "boilerplate" is the set of actions required to build, which is often much more involved for newer tools and frameworks.

You can just say initially, e.g. when I learned C++, that "#include <iostream>" can be "mostly ignored for now; just know that iostream stands for input/output stream, and this line is necessary to, e.g., output with std::cout", and no scars are left from this.


Some scars... cout << was always a bad idea, and it taught people early in their C++ development that being overly clever with operator overloading was expected.


Which I think most people understood not to do themselves once they encountered their first issue with the precedence of << versus other operators in expressions they wanted to print.


Good ol' Kernighan strikes again [0]

[0] - https://www.laws-of-software.com/laws/kernighan/


Another thing that impedes us is the sunk cost fallacy. A classic "simple vs. easy" tradeoff: even if a design is comparatively simpler, it is hard to make such a change for a small feature.

We had a project that was supposed to convert live objects back into code with autogenerated methods. The initial design used a single pass over the object graph, creating HDL abstractions and combining method blocks in the same pass.

That is big, hairy code with a lot of issues. It would be simpler to handle one problem at a time: method generation in one pass, then converting the methods to HDL. But getting approval to change a deployed app is so hard, particularly when it is a complete rewrite.


Nice work.

Still buggy. If you increase the ball size and increase the speed, the whole thing goes black/white in 10 seconds.


Top story: Kiro: new agentic IDE


Just add “agent” to the search box. It’s saved in local storage.


I just added "agent" to the default exclusion list.


"Onedrive is slow on Linux but fast with a “Windows” user-agent"

"Agents raid home of fired Florida data scientist who built Covid-19 dashboard"

"Confessions of an ex-TSA agent"

"Terrible real estate agent photographs"

etc etc



I'm not sure what I'm supposed to see there. From my point of view, this is a low-effort, vibe-coded app that doesn't solve the problem the OP had; it's solving a different one. You'd need to at least train a small classifier based on something like BERT to actually address the issue. What I showed in my comment just demonstrates that this doesn't solve the problem the OP had.


Still seeing `Kiro: A new agentic IDE` BTW.


If the filters UI at the top shows "llm, ai" instead of "llm, ai, agent" then you probably have that previous search saved in localStorage.


Huge respect for all your articles and work on LLMs, but this example should have been using AI to create a tool that uses AI to intelligently filter Hacker News :)


Someone posted that last week.

https://www.hackernews.coffee/


Hi,

I think I have mentioned this before on HN too. I am not from a CS background and just learnt the trade on the job, even the normal stuff.

We have a project that tries to reify live objects into a human-readable form. The final representation is very complicated, with a lot of types, while the initial representation is less so.

To make it readable, wherever there are common or similar data nodes, we have to compare and combine them, i.e. find places that can be turned into methods and find the relevant arguments for all the calls (kind of).

The initial implementation did the transformation into the final form first, and then started the comparison. So the comparison had to deal with all the different combinations of the types in the final representation, which made the whole thing so complex that, after being maintained by generations of engineers, nobody had a clear idea how it worked.

Then I read about hashmap implementations later (yep, I am that dumb) and it was a revelation. So we did the following things:

1. We created a hash of the skeleton that has to remain the same through the whole set of comparisons and transformations of the "common nodes" (something similar to methods or arguments), and only compared nodes with matching skeletal hashes;

2. created a separate layer that does the comparison and creates common nodes on the initial primitive form, and then does the transformation as a second layer (so you don't have to deal with all the types in the final representation); and

3. Don't type. Yes. Data is the simplest abstraction, and if your logic can be made into data or some properties, please do yourself a favor and make it so. We found a lot of places where weird class hierarchies could be converted into data properties.

Basically, it is a dumb multi-pass decompiler.

That did not just speed up the process; it also resulted in much more readable and understandable abstractions and code. I do not know if this is widely useful, but it helped in one project. There is no silver bullet, but types were the actual problem for us, and this is how we solved it.
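A toy sketch of the skeletal-hash idea (not the actual project code; `Node` and its fields are hypothetical): hash only the invariant "skeleton" of each node, then deep-compare only nodes that land in the same bucket.

```java
import java.util.*;

// Hash only the parts of a node that must match for two nodes to be
// comparable ("kind" and arity here), ignoring the varying arguments.
class Node {
    final String kind;        // part of the skeleton
    final List<Integer> args; // varies between otherwise-similar nodes
    Node(String kind, List<Integer> args) { this.kind = kind; this.args = args; }

    int skeletonHash() {
        // Deliberately ignore the argument values themselves.
        return Objects.hash(kind, args.size());
    }
}

public class SkeletonDemo {
    // Group nodes by skeletal hash; only nodes sharing a bucket are
    // candidates for the expensive deep comparison and merging.
    static Map<Integer, List<Node>> bucket(List<Node> nodes) {
        Map<Integer, List<Node>> buckets = new HashMap<>();
        for (Node n : nodes) {
            buckets.computeIfAbsent(n.skeletonHash(), k -> new ArrayList<>()).add(n);
        }
        return buckets;
    }

    public static void main(String[] args) {
        List<Node> nodes = List.of(
            new Node("add", List.of(1, 2)),
            new Node("add", List.of(3, 4)),  // same skeleton as the first
            new Node("mul", List.of(5, 6)));
        System.out.println(bucket(nodes).size()); // prints 2
    }
}
```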


Are cells not computers in some way? We are made of cells, and cells work with chromosomes. Chromosomes are coded with the A, T, G, C bases, and each triplet codes for an amino acid in a protein.

And the activation and deactivation of some sequence happens in response to the presence of proteins. So chromosomes are code, and the input and output are proteins. If our fundamental building blocks are computational in nature, what does that make us?


Physical systems are computable only in approximation. And quantum uncertainty throws another wrench into it. We also know that arbitrarily small rounding errors in the computation can lead to arbitrarily large differences with the actual system down the road. No, cells are not computers (in the sense of the Turing model). (However, that doesn’t mean that one can’t still consider them to be mechanistic and “soulless”.)


I meant it in the sense that there is a well-defined alphabet (A, T, G, C), each triplet of these letters is responsible for a specific amino acid, and combinations of the resulting proteins make each cell what it is. (There are about 20 different amino acids, and we have four letters coming in triplets. If pairs or quadruplets were responsible instead, there would be too few or too many combinations. It is not perfect, but given the constraints, there is some balance.)
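The counting argument in that parenthetical can be checked directly (a toy illustration; the class and method names are made up):

```java
public class CodonCount {
    // Number of distinct codes of the given length over an alphabet.
    static long combinations(int alphabetSize, int length) {
        return (long) Math.pow(alphabetSize, length);
    }

    public static void main(String[] args) {
        // 4 bases taken in pairs: only 16 codes, too few for ~20 amino acids.
        System.out.println(combinations(4, 2)); // prints 16
        // 4 bases taken in triplets: 64 codes, enough with redundancy to spare.
        System.out.println(combinations(4, 3)); // prints 64
    }
}
```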

A single-letter change in specific places can cause genetic defects like sickle cell anemia. And which sequence has to generate a protein (execute) depends on the presence of certain things, encoded as proteins again.

And when viruses enter a cell, the cell starts to execute the viral genetic material. Even if these are not exactly Turing-complete, do they not mimic many aspects of computation?


There are some aspects that have some similarity to computation, but also many that do not. If your aim is "aren't we really just computers", that doesn't actually work.

That’s not to say that computers couldn’t do what the brain does, including consciousness and emotions, but that wouldn’t have any particular relation to how DNA/RNA and protein synthesis works.


I did try to ask are we not computers. I tried to imply that, at the fundamental level, there are striking similarities to computation.

> That’s not to say that computers couldn’t do what the brain does, including consciousness and emotions,

Yes. The fundamental building blocks are simple and physical in nature, and they follow the computational model well enough to serve as nice approximations.

> but that wouldn’t have any particular relation to how DNA/RNA and protein synthesis works.

Hmm... transistors are not neural networks either, so? I am sorry, I am a non-native speaker and maybe I am not communicating things properly. I am trying to say that the organic or human is a different manifestation of order: one is chemical and the other is electronic. We have emotions and consciousness, but we can agree we are made of cells that send electrical pulses to each other and are primitive in nature. Even emotions and beliefs are physical in nature (Capgras syndrome, for example).


> I did try to ask are we not computers.

I meant to say "I did not try to ask are we not computers."


> There are some aspects that have some similarity to computation, but also many that are not.

What I have explained is the exact way a chromosome works, its raison d'être. I think this cannot be dismissed as just some aspect of it. It is its essence.


> (However, that doesn’t mean that one can’t still consider them to be mechanistic and “soulless”.)

How should we describe or approximate the things happening in a cell?


I don’t know about “should”, but fundamentally we can describe them by the laws of physics.


IIRC, Gödel, Escher, Bach discusses the comparison between chromosome/protein generation and computation.


You might like The Gene: An Intimate History[0]. It is a really good book.

[0]: https://www.amazon.com/Gene-Intimate-History-Siddhartha-Mukh...


I think there are similar remarks about Bill Gates in another good memoir by Microsoft co-founder Paul Allen [1]. Even in his school days, Gates was sure he would have no competition in math, since he was the best at it in his school. When he went to Harvard (which I somehow misremembered as Princeton, as a commenter pointed out) and saw people better than him, he switched from pure math to applied math. (The remarks below are Paul's.)

> I was decent in math and Bill was brilliant, but I spoke from experience at Wazzu. One day I watched a professor cover the black board with a maze of partial differential equations, and they might as well have been hieroglyphics from the Second Dynasty. It was one of those moments when you realize, I just can’t see it. I felt a little sad, but I accepted my limitations. I was OK with being a generalist.

> For Bill it was different. When I saw him again over Christmas break, he seemed subdued. I asked him about his first semester and he said glumly, “I have a math professor who got his PhD at sixteen.” The course was purely theoretical, and the homework load ranged up to thirty hours a week. Bill put everything into it and got a B. When it came to higher mathematics, he might have been one in a hundred thousand students or better. But there were people who were one in a million or one in ten million, and some of them wound up at Harvard. Bill would never be the smartest guy in that room, and I think that hurt his motivation. He eventually switched his major to applied math.

Even Paul admits he was torn between going into engineering or music. But when he saw a classmate give a virtuoso performance, he thought, "I am never going to be as great as this." So he chose engineering.

Maybe it is a common trait in ambitious people.

Edits: Removed some misremembered information.

[1] https://www.amazon.com/Idea-Man-Memoir-Cofounder-Microsoft/d...


Huh. I remember being miles ahead of my peers in computer science in high school. When getting to college and finding people most definitely better than I was, I was incredibly excited to finally find such people, not scared away.


in my experience, people who grow up as the biggest fish in a small pond (whether in just the fields they care about, or in general) are, 99% of the time, one of two types when they end up a middling fish in the big pond: like you, happy to find peers and inspiring exemplars to collaborate with and learn from, or those who hate that they are not the best anymore.

the former group probably leads the healthiest & happiest life fulfillment while pursuing their interests — i'm heavily biased though because i too fall into this category and am proud of this trait.

the latter group consists of people who either spin their wheels real hard and more often than not burn out in their pursuit of being the best, or pivot hard into something else they think they can be the best at (often repeatedly every time they encounter stronger competition) like gates & co, or in rare cases succeed in being the best even in the more competitive environment.

this last .001% are probably people whose egos get so boosted from the positive reinforcement that they become "overcompetitive" and domineering like zuck or elon, and let their egos control their power and resources to suppress competition rather than compete "fairly" ever again.

i think there's a subset of people from both main groups that may move from one into the other based on life experiences, luck, influence of people close to them, maturity, therapy, or simply wanting something different from life after a certain point. i don't have a good model for whether this is most people, or a tiny percentage.


I think the more common outcome you're not seeing, for the "other" group, is that they just go back to smaller ponds where they excelled in the first place, and often make strong contributions there.

Once it's been observed that there are bigger fish, you can't really go back to the naive sense of boundless potentiality, but you can go back to feeling like a strong and competent leader among people who benefit from and respect what you have.

Your comment focuses on the irrepressibly ambitious few who linger in the upper echelons of jet-setting academia and commerce and politics, trying to find a niche while constantly nagged by threats to their ego (sometimes succeeding, sometimes not), but there's many more Harvard/etc alum who just went back to Omaha or Baltimore or Denver or Burlington and made more or less big things happen there. That road is not so unhealthy or unhappy for them.


this is a very good point, and a blind spot in my comment because IME people who left the small pond in the first place were dissatisfied and unfulfilled there.

it is absolutely possible that after experiencing the bigger pond, people can develop purpose in their "original" pond based on values like community and relationships, or even simply dislike the vibes in bigger ponds and want to undo as much as they can. this is a super valuable thing to society and humanity for the most part, as perhaps more change can happen this way than big things happening in big places.

personally i struggle with this, because whenever i re-enter a smaller ecosystem (including/such as the one i grew up around) i feel like everyone has a distorted view of the bigger pond and self-limit themselves, which is a contagious energy i can't stand.


well put


In pure math at a school like Harvard, the standout kids like the ones in that quote are probably trying to become tenured math professors. There are very few such positions available. You can shoot for the stars, and if you succeed, make about the same as the average software engineer. More likely, you get stuck as a postdoc. So most students give up pure math at some point. If you realized you weren’t cut out for it in freshman year, you got a head start over the people who got a math PhD before finding out the hard way.

This pressure didn’t exist in computer science because there were plenty of tech jobs for anyone competent (not sure if that’s still true in 2025). And you didn’t need to be a genius to build something cool.


Math can also be taught very young with compounding effect, but you’re very unlikely to be exposed to the coaching and expertise at a young age. Of course the few in the world who combine aptitude with exposure are the kind of people you will find at Harvard. If you’re not one of them you may be a decade behind.

I also had a math professor who believed in extreme differences within the research community. He said only a top advisor would actually be engaging with real research and be able to bring you with them.

> More likely, get stuck a postdoc.

I still can’t understand why the outcomes for math PhDs are so bad. They have extremely general intelligence, applicable to any job I’ve had. I think it’s some combination of being unable to sell, unable to explain what they do, and still having their aspirations defined by professors.


It's because it's considered settling for less to "sell out to industry."

Kinda reminds me of the old "amateur athlete" paradigm.

It's not that you can't get a good job with a math PhD, it's that you can't get a good job and the respect of your peers/community. I'm sure there are plenty of companies that would be thrilled to hire math PhDs, they just don't also offer a ton of opportunities to work on cutting edge (math) research and publish papers.


Excuse me for generalizing the point; that's not fair to do based on these anecdotes alone. But I can also understand their perspective.

Paul continued to be a guitar player all his life and hosted jam sessions in his home. I started with piano very late in my life and am not very regular, but I am just happy to join the fun party.


Congratulations on learning piano. I think everyone who is capable of learning an instrument should consider it.

Rachmaninoff once said, "Music is enough for a lifetime, but a lifetime is not enough for music." So, no matter when one starts, there would never be enough time to truly master the craft.

I believe it is better to start late and enjoy it than to start early and burn out.


Thanks a lot. It is really fun. But, I don't have adult company in my neighborhood.

If you take the "what if I don't become great at this" anxiety out of the equation, it is just more fun, and life seems a little more colorful as a beginner.


That’s not a common reaction with humans. When people are the best, there’s a huge serotonin rush. Like literally this is measurable in humans.

Serotonin regulates dominance hierarchies and is associated with happiness. It’s so biological in nature that the same effect can be witnessed in lobsters. People or lobsters high in dominance have more serotonin and are generally happier.

Your story is not just anomalous; it's anomalous to the point where it's unrealistic. I can't comment on this, but if you did not feel the associated comedown of serotonin, I'm more inclined to say you're not being honest with yourself than that you're a biological anomaly. There's likely enough variation in genetics to produce people like you, so I'm not ruling it out.


It sounds like the commenter above is just less insecure about themselves and more excited for opportunities to discuss and learn than you and whoever you're describing here are.


No, I'm saying dominance hierarchies are the natural order of things, ingrained in biology.

Pretending that hierarchy doesn’t matter and that you don’t care where you are in that hierarchy is lying to yourself.

It’s like saying the janitor is equal in respect to the software engineer. We don’t like to admit it, but the janitor is less respected and looked down upon. I’m annoyed by people who pretend it doesn’t matter.


I don’t know if some people are just wired differently, but I can back up the feeling of not caring at all where I fall in a hierarchy or how much people respect or don’t respect me.

The things I find most thrilling always relate to being challenged. Finding someone better than me qualifies. Having my ideas challenged or being proven wrong are the most positive experiences I’ve had, especially being forced to change deeply held beliefs. I mention this because it’s one of those things people always say everyone hates, but I’ve always felt the opposite, just from a pure chemical-feeling perspective. I don’t think I could possibly be unique in that experience.


Human instinct is a complex of different things acting in opposite directions, including things that work against hierarchy.

I'm shocked that you think this is an unbelievable reaction, I know lots of people who really do think like that.

I wonder if you might find C S Lewis's lecture on the "inner ring" interesting.

https://archive.org/details/1944-the-inner-ring


I don’t think they said anything about their serotonin. They just described their reaction to the situation. If we were able to ask lobsters about their self-experience we might learn something about them too.


You sound like Jordan Peterson.


A more charitable interpretation might be that once they saw the level of skill required to contribute to a field, they switched to one where they could contribute more meaningfully.


I think the reality, though, is that you don't need to be in the top 0.001% to contribute to a field; you just need a unique take/voice. Trying to be the best at anything is a bad strategy in a connected world.


Yeah, but these stories are also about people who were not even starting out in a field yet. These are teenagers. It really stands out that they could think about where they could make the most impact in the world at such a young age.


Agreed, it's very impressive. The distribution of capability in the human race is incredible.


What are you talking about? Our society harasses every teenager to think again and again and give definite answers to exactly that kind of question. It's completely normal and exactly like every other young person.


Especially for smart kids who are used to getting in that positive feedback cycle of rewards and admiration.


And to understand that there are people who are much better, to internalize it, and to change majors also requires some intelligence. I wish I had had that insight instead of banging my head against the walls, barely passing while others sailed through and continued to a PhD with half my effort.


There’s a very, very similar story about Jeff Bezos and physics.

https://youtu.be/eFnV6EM-wzY?si=Nc_EqhXEFJVuQWS6

I’m not making this up. Seems like a shared personality trait among these people.


I’m pretty sure Gates went to Harvard, not Princeton.


You are right. I should have looked it up.



> Even Paul admits, he was torn between going into Engineering or Music. But, when he saw his classmate giving virtuoso performance, he thought "I am never going to as great as this." So, he chose engineering.

Coincidentally, I had a very similar experience and made a similar decision to switch to software engineering. The irony is that I am just as bad at software engineering, if not worse. Oh well, not a day goes by that I regret my decision.


"Oh well, I'm not going to be Andres Segovia, so I guess I will never pick up a guitar."

I think that attitude comes from people who are deeply unhappy. They need therapy.


When I was 18 years old and a new classical guitar student, I was very fortunate to hear the Maestro in concert. I even got to meet him briefly afterward because my music professor had some connection to him.

I was blown away at the time by what was possible: even though he was very old, had to be led out onstage by the arm, needed help getting seated, and had the guitar placed in his lap, what he could still play was so far in advance of anyone in my class, who were all in attendance.

The temptation (and I have felt this many times since, after hearing various guitarists) could have been "I should just quit now because I'll never be that good." But I'm glad I didn't succumb to that and decided instead that "I'd rather not sound like anyone else," still feeling pleasure and accomplishment from playing on my own terms.


I wonder if our professors knew each other?

My classical guitar instructor was well acquainted with Segovia, and he himself was a student of Julian Bream. However, my instructor was without a doubt one of the angriest people I have ever interacted with. He was somewhat better known for his arrangements and less so as a performer.

> "I should just quit now because I'll never be that good."

I never had to think about this because my instructor would often tell me this. XD


Mastery comes with age; there's no way around that.

https://www.youtube.com/watch?v=5pi7WcHqBNU - Here are some bits of wisdom from Japanese master chefs, both young and old.


In light of this, it's weird how the software industry, especially startup culture, is so rife with age discrimination.


Experienced people see through b.s. and push back. Less experienced people are simply easier to exploit. And whether or not the current job market allows us proper perspective, a large part of our working population is exploited. That’s how capitalism works in practice.


It's not necessary for a work of fiction to focus on diverse and realistic characters, particularly when its primary aim is to critique a specific aspect of technology. In such cases, characters often function merely as means to highlight and amplify that central theme.

Take 1984. It reads like a thought experiment reflecting the author's deepest fears about the dangers of unchecked power structures. Allegedly, Orwell’s own son would have been around 40 years old in the year 1984. (I read this in Pynchon's introduction to the Penguin edition of the book; it was a great essay.)

But 1984 also features a great protagonist and absolutely haunting language. While many of the other characters mainly serve to convey the broader ideas, it’s the protagonist who grounds the story emotionally. His suffering, his moral collapse, and the eventual loss of his ability to think were so tough to read and will forever haunt me. When he breaks, it feels like a loss for all of humanity. What I mean is that characters are not essential to making a great work: when Orwell wants to convey his ideas, the characters are sidelined and the ideas take the wheel.

I understand your perspective. I'm not a fan of many of the episodes either. I really liked the first season, but the ones that followed just didn’t live up to it. And it does not rise above a horror centered around some particular technology. But, it's them give it cultural relevance.


> But 1984 also features a great protagonist and absolutely haunting language. While many of the other characters mainly serve to convey the broader ideas, it’s the protagonist who grounds the story emotionally. His suffering, his moral collapse, and the eventual loss of his ability to think were so tough to read and will forever haunt me. When he breaks, it feels like a loss for all of humanity. What I mean is that characters are not essential to making a great work: when Orwell wants to convey his ideas, the characters are sidelined and the ideas take the wheel.

This paragraph goes one way and then suddenly pivots to the opposite conclusion without any justification. Orwell's character is why the story is wrenching. Without that emotional weight it has no staying power.


I kind of think both are true. I will remember Winston as a great thinker who is extremely aware of his world. And his tragedy, his death, is the death of his awareness, his ability to think. Of all the protagonists I have seen in tragedies, he is peculiar. While reviewing another writer's work, Orwell said:

> ‘... was a bad writer, and some inner trouble, sharpening his sensitiveness, nearly made him into a good one; his discontent healed itself, and he reverted to type. It is worth pausing to wonder in just what form the thing is happening to oneself.’

In the first act, the writing was so cold that I could not feel any connection to Winston. Even when getting intimate with Julia, he is thinking:

> In the old days, he thought, a man looked at a girl’s body and saw that it was desirable, and that was the end of the story. But you could not have pure love or pure lust nowadays. No emotion was pure, because everything was mixed up with fear and hatred. Their embrace had been a battle, the climax a victory. It was a blow struck against the Party. It was a political act.

I don't know when I started to feel things and empathize with him so much. When you think about the circumstances and how he feels, he is as cold as it gets, always scheming.

And in the most hopeful time of his life, he says this:

> ‘We are the dead,’ he said.

> ‘We’re not dead yet,’ said Julia prosaically.

> ‘Not physically. Six months, a year – five years, conceivably. I am afraid of death. You are young, so presumably you’re more afraid of it than I am. Obviously we shall put it off as long as we can. But it makes very little difference. So long as human beings stay human, death and life are the same thing.’

But when you think of an inner life, he has one of the richest and rarest ones. We empathize with that, and when the glass paperweight is smashed, it was the most tragic thing I have experienced. I think the genius of Orwell is that he made the character and the idea indistinguishable.


I meant to type "it was one of the most tragic things I have read," but it is too late to edit.


From my other comment elsewhere. These resources helped me understand the topics better.

If anyone wants to understand fundamentals of machine learning, one of the superb resources I have found is, Stanford's "Probability for computer scientists"[1].

It goes into the theoretical underpinnings of probability theory and ML better, IMO, than any other course I have seen. But this is primarily a probability course that discusses the fundamentals of machine learning. (Yeah, Andrew Ng is legendary, but his course demands some mathematical familiarity with linear algebra topics.)

There is a course reader for CS109 [2]; you can download a PDF version of it. Caltech's Learning from Data was really good too, if someone is looking for a theoretical understanding of ML topics [3].

There is also a book for the excellent Caltech course [4].

Also, Neural Networks: Zero to Hero is for understanding how neural networks are built from the ground up [5].

[1] https://www.youtube.com/watch?v=2MuDZIAzBMY&list=PLoROMvodv4...

[2] https://chrispiech.github.io/probabilityForComputerScientist...

[3] https://work.caltech.edu/telecourse

[4] https://www.amazon.com/Learning-Data-Yaser-S-Abu-Mostafa/dp/...

[5] https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxb...

