None of them. Study Knuth. Study the Intel manuals. Study “The Art of Multiprocessor Programming”. Study compilers. Study TCP/IP. Study algorithms and data structures. Write lots of code for all of those things.
Disregard the noise. Ban yourself from reading blogs and magazines and tech news. Focus on what is fundamental to the field. Look where nobody else is looking.
A good knowledge of the fundamentals translates well into high-level work.
Understanding the timeless things deeply will serve you much better than continually trying to understand whatever is hyped, only to have to move on to the next hyped thing before getting more than a superficial understanding.
Before the internet, Intel used to distribute these manuals in hard copy for free. You just had to drop by your local Intel sales office to pick up a copy. A good solid foot of shelf space.
These manuals used to be so much easier to read back in the 486/Pentium era. One could almost build a complete mental model of how a 486 worked, and how to manually optimize code to best effect by avoiding processor stalls.
Since then, Intel processors have accumulated an extraordinary amount of cruft, so it has become much harder to develop a complete mental model. Compilers have also gotten a lot more clever, in order to deal with the added complexity of SIMD instruction sets.
For those of us who started with the 8086 Architecture manuals, each generation of processors added additional features which one learned by occasionally revisiting the architecture manuals for new processors.
Coming to the Architecture manuals without having the foundation of previous Architecture manuals as a basis must be a daunting task. But I'm sure there's rich material there anyway.
I miss the days of printed manuals, or even manuals at all. I killed a big software purchase years ago because they did not even have a PDF manual. The jerks told us to use their forums. No way I would spend money with such arrogant pr_cks. Needless to say, I find myself spending real coin on manuals on eBay. They are worth it.
I'd like to give you some advice based on my own experience, and without making too many assumptions about you and what you haven't told us about your situation.
I absolutely believe there is a way for you to get through this. Don't be discouraged!
The first thing you must do is to kill the negative thought train. Don't panic, don't assume your career is over, that you have no hope, that it's going to get worse, that people are going to find out, that you'll get fired, etc. Whatever you're fearful about as a result of realizing you're burnt out, try to set it aside for now because you don't need any more stress or worry. It sounds like you have time to think about what to do, so take that time, and don't put yourself under any pressure. Accept the state you're in. It's ok. This happens to many, many people, and you're not the only one. Moreover, your world doesn't sound like it's about to end. Relax about it, as much as you can.
I would echo what some others have said here about telling your boss/team: don't. What I think you need to do is look at what your day-to-day job is like, and see if there's anything you want to change. Is there anything that's really dragging you down? Do certain tasks or projects drive you crazy or make you feel depressed? Are you unhappy with your role in general? Do you hate your office and need to move? Is WFH making you unhappy, and you need to be back in a real office? Try to ponder it all, see if there's something you could change that would give you some quality of life improvement. When you think you have something, take that to your boss, and see what they can do about it. Don't engage in self-enfeeblement when you have that talk. You're entitled to set boundaries for yourself and ask for a change without having to divulge anything about your personal state of mind. Just say plainly that you're not feeling happy in your current role, and you'd like to give something else a try (i.e., whatever alternative you're going to present). Ask if that's possible now or at some point soon. If not, ask if there's something else available right now that they could offer you because you feel like you really need a change. Consider the options, but only choose one if it appeals to you, not just because you think you need to choose one.
If it's a hard no from the boss or if there are no good options, then ok, no problem. Consider your next options. A job is a job. If things aren't so bad where you're at, then maybe it's worth staying. Ask yourself if perhaps you are too emotionally invested in having a "great career". If so, work on detaching emotionally. Show up each day when you have to, and leave when you can. It's ok to do a good job, and do it for the pay check. You don't have to accomplish anything great. You don't have to break your back for your employer. If you stay where you're at, look at ways that you can change little things to give yourself some feeling of agency and autonomy. For me, I had schedule flexibility, so I engaged in a daily act of rebellion where I would sleep in, show up in time for the daily stand-up, then do what I could until I felt like I was done for the day, and then I went home. If you don't want to stay where you're at, then take time to think about what you want to do next.
I can recommend seeking professional help. A good therapist is a very valuable disinterested third-party who can help you think everything through. Unless your situation is in dire straits, the best way forward is to make small but well-considered steps that will help create emotional and psychological space for yourself to heal. Oftentimes, what we think is the one problem causing us to burn out isn't actually the only problem contributing to the situation. A therapist can give you perspective on that. Perhaps anti-depressants might be a help for you, but a therapist can help you figure that out too. Don't sign up for the drugs straight away unless it is clearly needed.
I can also recommend walking, every day if you can, for as long as you're able or it remains enjoyable for you. Wooded areas and around lakes are great to walk around, but anywhere will do. It will help you relax and mull things over. It really will. Listen to music if you like, but also try it without. Stop and sit along the way. Take in the sky, the birds, the trees, the landscape. Just enjoy being there, and the beauty of nature. Think about whatever's on your mind, but don't force yourself to think of a solution. Let your mind wander. Space out. Relax.
Over time, you will be able to figure out what you need to change. It took me a couple of years, to be honest, but that's fine. It's not a race. The building (i.e., my life) wasn't burning, so I just focussed on taking my time because I didn't want to change too much too soon and end up back in the same hole at some point down the road. I took care to sleep well (but not too much), walk regularly, and try to do things I enjoyed (or used to enjoy). No pressure. If I didn't feel like doing anything intellectual on a particular day, I didn't. I got a lot of mileage out of housework, cooking, and walking on those days.
I believe you can figure this out. Hang in there, my friend!
Without much experience in the industry, I would say that both perspectives are important:
The person most invested in your success is you, so leverage that and promote yourself.
But you absolutely deserve management and teammates who celebrate your accomplishments and help you get rewarded for them.
Discrimination and prejudice also affect this—it’s probably hard to advocate for yourself if your management just doesn’t believe you’re capable for some reason—but I’ve generally found solace in the synthesis of both attitudes.
Fight for yourself, but find people who fight with you. Maybe put it like this: if you yourself were a people manager, wouldn’t you want to advocate for your reports?
This is generally true; however, in a toxic workplace, being effective might make you a target for doomed projects or envy, even sabotage. You may see people get promoted by threatening to leave rather than by doing good work. Sadly, you can't assume good intent in every situation.
When I switched to a different team a year ago, I talked with my ex-manager and asked what he considered my strengths and weaknesses. I found the answer quite funny because it was a random sample of mostly minor things. It showed me that my manager actually had no clue what I was doing all day. He is a nice guy and wants to be a good manager, but that is harder than it looks.
His biggest criticism was that 20 months earlier, I had skipped him and addressed his boss's boss for some bureaucratic thing. I find that argument reasonable, but it showed me that he was not aware of the full context. Either I never explained it to him or he forgot. The context: at that time I was in a special two-week task-force team where his boss's boss was officially involved as Scrum Master. As such, he was officially responsible for impediments. The impediment was: we either get this bureaucracy thing out of the way today, or I'm unable to participate in the task force anymore. Given the urgency, and him being our Scrum Master, I found skipping levels the right thing to do.
How ironic is it that the Haskell Research Compiler is written in Standard ML, and not Haskell? Joking aside, SML is a great language, and often overlooked, so this is good to see!
Haskell was the second frontend. Don't think we ever did an SML frontend. We can't really discuss what the original frontend was, so it was scrubbed from the release.
Is that because it became an internal product with a competitive advantage, or something more boring? I've abstracted away enough detail that you can hopefully answer.
It's not an internal product but I wouldn't call it boring either. There were some new ideas and some non-technical reasons why they couldn't be used at that time but they could still be viable and potentially something we wouldn't want public.
Not quite; Hotspot (Sun/Oracle JVM) and Android Runtime are both C++; the major C/C++ implementations (GCC, LLVM/Clang, MSVC, Intel C++ Compiler) are all C++ too, although GCC was C until 2010 and MSVC's 16-bit ancestors were as well.
1994-1998 was a bad time for C++. Stepanov had just come along with the STL idea, and there were various competing implementations of it. And I don't think I need to say anything more about the travesty of MFC other than to acknowledge its mere existence.
C++14 (the latest standard) is a far better language than C++ in the '90s, and it really does beat C when it comes to abstraction capabilities, type safety, and standard library functionality. C++11 also defined a standardized memory model which is extremely useful for writing multithreaded code.
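As a concrete illustration of what that memory model buys you, here is a minimal release/acquire sketch (my own toy example, not from any particular codebase): once the consumer observes the flag, it is guaranteed to see the producer's write to data.

    #include <atomic>
    #include <cassert>
    #include <thread>

    int data = 0;
    std::atomic<bool> ready{false};

    void producer() {
        data = 42;                                    // plain write
        ready.store(true, std::memory_order_release); // publishes `data`
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) { /* spin */ }
        assert(data == 42); // cannot fire: the acquire load synchronizes
                            // with the release store
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }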
Yes, C++ is not perfect -- the C++ standard library lacks a lot of functionality found in the libraries of other languages; there is a lot of complexity in the language that one must master to really "know" C++; the language itself has dark corners and disappointments; and so on. No language is perfect, and every language permits bad code.
I've been working with C++ full-time for the past three years. I didn't know much about the language before then (I was a "C 43var!" guy), but had heard all the horror stories about it. Now, I'm convinced that it's probably the best general purpose programming language available to date. It combines the full power of the machine with very expressive abstraction capabilities. Having seen what can be done with C++, I decided to study harder and try to master it.
I think this last point is ultimately what puts people off, and leads to a lot of FUD about C++ -- it takes hard work to master the language and the tools to work with it. Nobody really wants to do that, not when there are seemingly viable, and easier, alternatives. Instead, people are more willing to invest huge amounts of effort and money to try to scale up those alternatives if it means they can avoid the complexity of a language like C++. That's fine, I guess; we all have to make the appropriate engineering trade-offs. It's just unfortunate that many people fall into extremism about it in order to justify their attempt to simplify the reality of computing (which is actually considerably complex).
What comments in support of the NSA? Any comments in support of the NSA here are IMMEDIATELY flag-killed by the HN Nazis who love to suppress any kind of free speech they don't agree with.
Algorithmic trading is intrinsically neither good nor bad. It's kind of just a natural progression in the markets once easily programmable computers and high-speed networks came along.
How algorithmic trading is used, however, is another story. Humans are still the ones who bring the intent to technology, using it for good or evil. But it may interest you to know that a lot of malicious trading in the markets is not done by algorithmic traders, but by teams of manual traders working in concert to place manipulative trades that cause the market making algorithms to move the market in certain ways. (Source: I work at an exchange, and this is what our regulatory and compliance dept. says all the time.)
Market state -> price forecast
Price forecast + other factors (risk, liquidity) -> trading decision
If you know (or have a good guess) how the algorithm works, you can work out what state of the market would lead to a price forecast in your favour, i.e. would lead to the algorithm offering liquidity at a price favourable to you.
You then manipulate the state of the market to look like that (generally this is "spoofing" or "layering"), wait for the algorithm to respond, and then take advantage of the favourable liquidity it offers.
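To make that concrete, here is a deliberately toy sketch (made-up numbers and a made-up pricing rule, nothing like any real market maker) of how layered fake bids can skew a naive imbalance-based forecast:

    #include <iostream>

    // Toy model: the algorithm skews its quote by order-book imbalance.
    struct Book { double bid_qty, ask_qty, mid; };

    double forecast(const Book& b) {
        double imbalance = (b.bid_qty - b.ask_qty) / (b.bid_qty + b.ask_qty);
        return b.mid + 0.05 * imbalance; // heavier bids => expect price to rise
    }

    int main() {
        Book fair    {100, 100, 50.00}; // balanced book
        Book spoofed {900, 100, 50.00}; // spoofer layers 800 lots of fake bids
        std::cout << forecast(fair)    << "\n"; // 50.00
        std::cout << forecast(spoofed) << "\n"; // 50.04: algorithm bids higher
        // The spoofer sells into the improved bid, cancels the fake orders,
        // and buys back once the quote reverts.
    }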
When people calibrate their algorithms, they use real market data. Most of the time, no one is actively manipulating the market, so the model is not calibrated to handle those situations.
I know someone who used to work at an algorithm trading firm and did electronic market making. He said they used to go to great lengths to program their algorithms to try to spot this sort of manipulation, but it's a hard problem because you're fighting against a cloud of bad traders who are coordinating their efforts across multiple market centers.
I have to say, the regulatory dept. at the exchange I work for does an excellent job of monitoring for any funny business. They have some nifty real-time tools that are scriptable and can replay the state of the market at any point. It's really cool stuff!
I think you've taken a pretty narrow interpretation of "reliable software". It seems more reasonable to think that Bjarne was speaking at a general level, i.e., the idea that software written in non-garbage collected languages tends to be prone to memory management errors on the part of the programmer.
This has been a common meme for the past 15 to 20 years precisely because it is so easy to forget when a block of memory needs to be freed. However, the mechanisms of reference-counted smart pointers, RAII, and clearer ownership semantics in the language, go a long way to help mitigate the common manual memory management problems in C++.
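To make that concrete, here is a minimal sketch of those mechanisms (illustrative types and names, not from any real codebase):

    #include <cstdio>
    #include <memory>

    // RAII: the destructor releases the resource on every exit path,
    // so there is no "free" call to forget.
    struct File {
        explicit File(const char* path) : f_(std::fopen(path, "r")) {}
        ~File() { if (f_) std::fclose(f_); }
        File(const File&) = delete;            // single owner: no copies
        File& operator=(const File&) = delete;
        std::FILE* get() const { return f_; }
    private:
        std::FILE* f_;
    };

    int main() {
        File f("/etc/hosts");              // closed automatically at scope exit

        auto p = std::make_unique<int>(5); // sole ownership, freed at scope exit
        auto q = std::make_shared<int>(7); // reference-counted ownership
        auto r = q;                        // count is 2; freed when both drop
        return (*p + *q + *r > 0 && f.get() != nullptr) ? 0 : 1;
    }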
The downside, of course, is that you have to know how to use these ideas to write "reliable software", and C++ does not make it easy. It's pretty much impossible to go from reading the standard to implementing correct and optimal C++ programs. There are so many gotchas, corner-cases, and features which require much study and experience to truly understand.
I'm not arguing that you need to avoid dynamic memory management to write "reliable software." I'm just pointing out that a lot of reliable software is written that way, and thus the idea that you need garbage collection to write reliable software is so obviously false that it's silly to think anyone believes it to be true, making it a terrible straw man to argue against.
> I'm just pointing out that a lot of reliable software is written that way, and thus the idea that you need garbage collection to write reliable software is so obviously false that it's silly to think anyone believes it to be true, making it a terrible straw man to argue against.
You're also arguing a straw man. Of course you can write reliable software without dynamic allocation. The question is: can you do it faster and/or cheaper using C++ or $ALTERNATIVE?
(You mentioned rocket and spacecraft guidance software as examples. That's an example of software that's exceedingly expensive to develop... and it doesn't actually do that much even though it's obviously complex.)
You say, "Of course you can write reliable software without dynamic allocation." Why is that "of course," if the myth being addressed is that you cannot write reliable software without garbage collection? If you're saying everyone knows that you can write reliable software without GC, and it'll just be expensive and such, then we're in agreement, because that's exactly what I'm saying.
Neural networks are impressive only in that they are able to give any kind of meaningful results at all. In the end, they are only a poor mimicry of real machine intelligence, and not much better, conceptually, than plain old nonlinear regression.
Nobody has been able to determine what the structure of a neural network should look like for any given problem (network type, number of nodes, layers, activation functions), how many iterations of the parameter optimization algorithm are needed to achieve "optimal" results, and how "learning" is actually stored in the network.
Statistical learning methods are obviously still useful, but I think the field is still wide open for something to emerge that is closer to true machine intelligence.
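To put the regression comparison in concrete terms: a single sigmoid "neuron" trained by gradient descent is just logistic regression. A toy sketch, in which all of the "learning" (here, the AND function) ends up stored in three numbers:

    #include <cmath>
    #include <cstdio>

    int main() {
        double x[4][2] = {{0,0},{0,1},{1,0},{1,1}};
        double y[4]    = {0, 0, 0, 1};              // AND truth table
        double w0 = 0, w1 = 0, b = 0, lr = 0.5;

        for (int epoch = 0; epoch < 5000; ++epoch) {
            for (int i = 0; i < 4; ++i) {
                double z   = w0 * x[i][0] + w1 * x[i][1] + b;
                double out = 1.0 / (1.0 + std::exp(-z)); // sigmoid activation
                double err = out - y[i];                 // gradient of log-loss
                w0 -= lr * err * x[i][0];
                w1 -= lr * err * x[i][1];
                b  -= lr * err;
            }
        }
        for (int i = 0; i < 4; ++i) {                    // near 0,0,0,1
            double z = w0 * x[i][0] + w1 * x[i][1] + b;
            std::printf("%g AND %g -> %.3f\n",
                        x[i][0], x[i][1], 1.0 / (1.0 + std::exp(-z)));
        }
    }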
Please, try implementing non-linear regression to understand images. Tell me how it goes.
Also, 'nobody knows how learning is stored'? You very clearly have never worked with neural nets before. Experience is stored in the form of weight values.
Okay, you've got a neural net that does a really good job of identifying types of animals in pictures. Unfortunately, whenever you show it a picture of a horse, it says 'fish'. Everything else, it's great at: marmosets, capybaras, dolphins, kangaroos; but it's got a complete blind spot for horses.
Where's the incorrect data stored? How can you fix it? It's in the weight values, somewhere, but you can't go and change the weight values to fix the horse/fish cascade without breaking everything else it knows.
Yes, we know 'where' the data is stored. But it's diffuse, not discrete, so we can't separate it from other data.
Umm, even still, you can train it on more horse photos to increase its performance specifically on horses. Furthermore, you can study neuron activation levels on that horse training data to reverse-engineer the neural "ravines" the activations settle into, and run comparison tests of those ravines against the ravines for, say, zebras.
This is something actively being done by NN researchers. And it lets us do things like take the low-level audio-processing part of a neural net trained on English voice data and use it to train smarter neural nets on Portuguese voice data than you could have trained without the English recordings.
Because I'm also one of the kooks from comp.lang.c where C is the One True Language.
But seriously, TXR is built on its own Lisp: an infrastructure which provides the managed environment and data representations which also support the TXR Lisp dialect.
This is no different from any Lisp implementation based on a C kernel, like CLISP, GNU Emacs, ...
If you do it from scratch, you lose a lot: you don't have a mature, optimized dynamic language implementation. But, by the same token, you can experiment in ways that you normally wouldn't. You get to dictate things like, oh, what a cons cell is. I have lazy conses that look like ordinary conses: they satisfy consp, and work with car, cdr, rplaca and rplacd. You can invent new evaluation rules. I came up with a way to have Lisp-1 and Lisp-2 in a single dialect, seamlessly, with the conveniences of both. I have Python-like array access. I made traditional Lisp list operations work with vectors and strings: you can mapcar through a string and so on.

Sequences and hashes are functions. For instance, orf is a combinator that combines functions analogously to the Lisp or operator. If hash1 and hash2 are hash tables, you can do something like [orf hash1 hash2 func] to create a one-argument function that will look its argument up in hash1; then, if that returns nil, it will try hash2, and if that returns nil, it will pass the key to func and return whatever that returns. Or ["abc" 1] returns the character #\b. [mapcar "abc" '(2 0 1)] yields "cab": the numeric indices are mapped through "abc", as if it were an index-to-character function. Fun things like this are good reasons to experiment with your own dialect.
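For anyone who doesn't read TXR Lisp, here is a rough analogue of that orf-over-hashes idea sketched in C++ (my own approximation, not how TXR implements it):

    #include <functional>
    #include <iostream>
    #include <optional>
    #include <string>
    #include <unordered_map>

    // Analogue of [orf hash1 hash2 func]: try each lookup in turn,
    // falling through on a miss (TXR's nil).
    using Lookup = std::function<std::optional<std::string>(const std::string&)>;

    Lookup orf(Lookup a, Lookup b, Lookup fallback) {
        return [=](const std::string& key) {
            if (auto r = a(key)) return r;
            if (auto r = b(key)) return r;
            return fallback(key);
        };
    }

    // Treat a hash table as a function from key to optional value,
    // mirroring how TXR makes hashes directly callable.
    Lookup from_map(std::unordered_map<std::string, std::string> m) {
        return [m = std::move(m)](const std::string& k)
                -> std::optional<std::string> {
            auto it = m.find(k);
            if (it == m.end()) return std::nullopt;
            return it->second;
        };
    }

    int main() {
        Lookup combined = orf(
            from_map({{"a", "1"}}),
            from_map({{"b", "2"}}),
            [](const std::string& k) -> std::optional<std::string> {
                return "missing:" + k;
            });
        std::cout << *combined("a") << " " << *combined("b") << " "
                  << *combined("c") << "\n"; // prints: 1 2 missing:c
    }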
I believe TXR is a great companion if you're a Lisper working in ... one of those other environments.
Ah, one more thing. Well, two, or maybe three. Part of why I used C was to create a project whose tidy, clean internals stand in stark contrast to some of the popular written-in-C scripting languages. You know, to sock it to them! See, there is a hidden agenda: the call of "I can do this better". If you use C, then a more direct comparison is possible. Secondly, people widely understand C. Give them a cleanly written project in C, and maybe they will hack on it, and from there understand something about Lisp too. C means low dependencies from the point of view of packaging: easy porting with just a basic shell environment, make, and a C compiler. Cross-compiling for ARM or whatever is a piece of cake. Easy work for package maintainers, ...
TXR is not built "on its own Lisp", it's built on C. If you believe that lisp is so great, then why didn't you just use ANSI Common Lisp? Why is TXR even necessary when I can do all the same data processing stuff in Perl, which is far more versatile and ubiquitous?
And all this nonsense about writing TXR in C because it's "more widely understood", "low dependencies", "easily packaged" - after 15-some years of advocacy in comp.lang.lisp, it's laughable that defsystem, asdf, and SBCL/CLISP/CMUCL aren't good enough for you.
Lisp is either as good as all the Naggums, Tiltons, and Pitmans of c.l.l. proclaim, or it's not. By writing TXR in C, you've just proved that it's not.
The point is that lisp advocates rarely seem to use any of these lisp implementations to do anything noteworthy or useful. They always seem to fall back on C, or some other language that's more "widely available" or "has minimal dependencies" or "has more potential contributors" or "can be more easily compared with other similar programs".
> The point is that lisp advocates rarely seem to use any of these lisp implementations to do anything noteworthy or useful.
That's possible. There are many Lisp dialects and implementations which have few applications. That's true for a lot of other language implementations, too. There are literally thousands of implementations of various programming languages with very few actual applications. Maybe it is fun to implement your own language from the ground up. Not something that interests me, but it does not bother me.
If he wants to implement a small new Lisp dialect, it's perfectly fine to implement it in C or something similar.
> They always seem to fall back on C, or some other language that's more "widely available" or "has minimal dependencies" or "has more potential contributors" or "can be more easily compared with other similar programs".
Some new dialect is written with the help of C? That bothers you?
Wow.
Actually 95% of all Lisp systems contain traces of C and some are deeply integrated in C or on top of C (CLISP, ECL, GCL, CLICC, MOCL, dozens of Scheme implementations and various other Lisp dialects). There are various books about implementing Lisp in C.
Really, nobody in the Lisp community loses any sleep over somebody implementing parts of Lisp in C.
> I find this hypocrisy to be quite intriguing.
Because some random guys implement their own language in C? Why do we have Python, Ruby, Rebol? There was already Perl or AWK or ... Somebody decided to write their own scripting language. So what?
> Because some random guys implement their own language in C? Why do we have Python, Ruby, Rebol? There was already Perl or AWK or ... Somebody decided to write their own scripting language. So what?
When a Python advocate wants to do some data processing, do they first write their own Python implementation in C? No. When a Ruby advocate wants to make a Rails website, do they first write their own implementation of Ruby in C? No.
Several fine implementations of lisp already exist that compile down to machine code and, if the lisp community is to be believed, have performance "close to C". So why does a lisp advocate feel the need to re-write lisp in C for a project that didn't actually need it? The lisp community would have us all believe that lisp is the "programmable programming language", and all the other rhetoric about how every other language has just stolen ideas from lisp, etc., etc. They all truly seem to believe that lisp is something special. That's why I find it laughable that someone like Kaz Kylheku, a 15-year veteran of comp.lang.lisp, decided not to implement TXR by using a pre-existing lisp implementation.
> When a Python advocate wants to do some data processing, do they first write their own Python implementation in C?
They write it in C. Check out the Python world sometime.
* CrossTwine Linker - a combination of CPython and an add-on library offering improved performance (currently proprietary)
* unladen-swallow - "an optimization branch of CPython, intended to be fully compatible and significantly faster", originally considered for merging with CPython
* IronPython - Python in C# for the Common Language Runtime (CLR/.NET) and the FePy project's IronPython Community Edition
* 2c-python - a static Python-to-C compiler, apparently translating CPython bytecode to C
* Nuitka - a Python-to-C++ compiler using libpython at run-time, attempting some compile-time and run-time optimisations. Interacts with CPython runtime.
* Shed Skin - a Python-to-C++ compiler, restricted to an implicitly statically typed subset of the language for which it can automatically infer efficient types through whole program analysis
* unPython - a Python to C compiler using type annotations
* Nimrod - statically typed, compiles to C, features parameterised types, macros, and so on
and so on...
> So why does a lisp advocate feel the need to re-write lisp in C for a project that didn't actually need it? The lisp community would have us all believe that lisp is the "programmable programming language"
Why don't you understand the difference between 'a lisp advocate' and 'the lisp community'?
> and all the other rhetoric about how every other language has just stolen ideas from lisp, etc., etc.
Nonsense.
> That's why I find it laughable that someone like Kaz Kylheku, a 15 year veteran of comp.lang.lisp, decided not to implement TXR by using a pre-existing lisp implementation.
Every single Python project you cited simply proves my point. They are Python compilers of some sort. TXR, on the other hand, is a data processing language implemented in its own lisp, which is implemented in C. In other words, TXR is an application of lisp, not just a compiler or interpreter like those Python projects you listed. So, all your examples are irrelevant.
TXR didn't need its own dialect of lisp. So, the question remains: why didn't Kaz use SBCL or CLISP? They're good enough for c.l.l. kooks like him to recommend to everyone else, but why're they not good enough for him to use?
The kook here is you, and I can prove it: you have a bizarre view that developers should be divided into political parties based on programming language, and code strictly to the party lines. Bizarre views make the kook.
TXR does need its own dialect of Lisp because Common Lisp isn't suitable for slick data munging: not "out of the box", without layering your own tools on top of it.
This is a separate question from what TXR is written in. Even if TXR were written using SBCL, it would still have that dialect; it wouldn't just expose Common Lisp.
That dialect is sufficiently incompatible that it would still require writing a reader and printer from scratch, and a complete code walker to implement the evaluation rules of the dialect. Not to mention a reimplementation of most of the library. The dialect has two kinds of cons cells, so we couldn't use the host implementation's functions that understand only one kind of cons cell. So, whereas some things in TXR Lisp could be syntactic sugar on top of Common Lisp, others could not be.
Using SBCL would have many advantages in spite of all this, but it would also reduce many opportunities for me to do various low-level things from scratch. I don't have to justify to anyone that I feel like making a garbage collector or regex engine from scratch.
So, the reasons for not using "SBCL" have nothing to do with "good enough". It's simply about "not mine".
TXR is a form of Lisp advocacy.
TXR is also (modest) Lisp research; for instance I discovered a clean, workable way to have Lisp-1 and Lisp-2 in the same dialect, so any Lispers who are paying attention can stop squabbling over that once and for all.
Why we have Lisp today with all the features we take for granted is that there was a golden era of experimentation involving different groups working in different locations on their own dialects. For example, the MacLisp people hacked on MacLisp, and it wasn't because Interlisp wasn't good enough for them. Or vice versa.
> So, the reasons for not using "SBCL" have nothing to do with "good enough". It's simply about "not mine".
Kaz, the C programming language isn't yours either. My point is that Common Lisp is supposed to be a general purpose programming language with power far greater than a primitive language like C, but you chose to implement TXR in C simply because C makes it much easier for you to accomplish your goal than Common Lisp. I'm just trying to point out the obvious, which nobody from c.l.l. seems willing to admit.
It's a tool with some embedded kind of Lisp dialect. There are zillions of them.
> why didn't Kaz use SBCL or CLISP?
Why should he? He can do whatever he wants. I personally don't care at all what he does. Why do you? Kind of a strange obsession with comp.lang.lisp. Are you one of the trolls posting there?
> They're good enough for c.l.l. kooks like him to recommend to everyone else, but why're they not good enough for him to use?
Probably he did it to annoy real programmers like you?
Kaz invested a bunch of time implementing a whole new backquote implementation for CLISP, but it's still not good enough for him to use CLISP to implement TXR? It doesn't make any sense!
Any right-thinking programmer should care about inconsistencies such as this. If I'm evaluating a programming language, and I see someone in its community writing their own language implementation to support an application that could've easily been written using one of the standard language implementations, then it looks to me like the standard implementations aren't mature enough or trustworthy enough for me to use for my application. Not only that, but it suggests that maybe this particular language isn't as good as its advocates claim, especially if I have to drop back down to C in order to meet certain requirements (e.g., portability, speed, wider understanding, etc.).
But any right-thinking programmer already knows that lisp is not worth wasting any time on. It's dead, and people like Kaz, and projects like TXR, are going to make sure it stays that way.
CLISP's licensing is somewhat confusing and appears to impose its license on the application. So, for example, I probably wouldn't use it for a commercial, closed-source application. For the same reasons, it cannot be used for a BSD-licensed application.
(However, I did use CLISP for the licensing back-end of such an application: that back-end runs on a server and isn't redistributed. Things you don't distribute to others cannot run afoul of the GPL.)
CLISP's license lets you make compiled .fasl files, and these are not covered by its copyright (unless they rely on CLISP internal symbols). However, that is where it ends. Memory images saved with CLISP are under the GPL. (Memory images are the key to creating a stand-alone executable with CLISP!) If you have to add libraries to CLISP itself, you also run into the GPL. I believe that this would cause issues for the users of TXR which they do not have today. For a user to be able to run the .fasl files, they need CLISP, and of course that has to be distributed to them under the GPL terms, and you can't add C libraries to that CLISP without tainting them with the GPL.
You can wrap TXR entirely in a proprietary application, including all of its internals: the whole image, basically. This wouldn't be possible if some of its internals were the CLISP image.
Regarding the GPL, I do not believe in it any more. I will not use this license for any new project. It is not a free software license in my eyes. Free really means you can do anything you want; any restriction logically means "not entirely free". Proprietary products that use free code do not take away anyone's ability to use the original. The problem with the FSF people is that they regard the mere existence of something as offensive. "It's not enough that there is a free program; we must litigate to death the non-free program which is based on the same code before we can be happy."