Softpanorama

May the source be with you, but remember the KISS principle ;-)
Skepticism and critical thinking is not panacea, but can help to understand the world better

Python -- Scripting language with generators and coroutines

A new competitor seemed to emerge out of the woodwork every month or so. The first thing I would do, after checking to see if they had a live online demo, was look at their job listings. After a couple years of this I could tell which companies to worry about and which not to. The more of an IT flavor the job descriptions had, the less dangerous the company was.
  • The safest kind were the ones that wanted Oracle experience. You never had to worry about those.
    You were also safe if they said they wanted C++ or Java developers.
  • If they wanted Perl or Python programmers, that would be a bit frightening-- that's starting to sound like a company where the technical side, at least, is run by real hackers.
  • If I had ever seen a job posting looking for Lisp hackers, I would have been really worried.

-- Paul Graham, co-founder of Viaweb

This is the fourth page of an ongoing series of pages covering major scripting language topics for Unix/Linux system administrators (others cover Unix shells, Perl, and TCL), based on Professor Bezroukov's lectures.

Python is now becoming a language that a Unix/Linux sysadmin must know at least at a superficial level, as for many users it is the primary language, either for development or for writing supporting scripts. Python has been hailed as a really easy language to learn. That is not completely true, but it got a strong foothold at universities (repeating the path of Pascal in this regard, but on a new level). The rest is history.

As most sysadmins know Perl, Python should be easy to learn at a basic level, as the key concepts are similar: Python is a language influenced by Perl, incorporating some European and, more specifically, Niklaus Wirth's ideas about programming languages. Python's core syntax and certain aspects of its philosophy were also influenced by ABC.

But in reality it is pretty difficult and even annoying to learn for accomplished Perl programmers. You feel like an accomplished ballroom dancer put on an ice rink: you need to re-learn a lot of things, and fall a lot of times. Moreover, all this hype about Python being easier to read and understand is just that: programming language hype. It reflects one thing: the inability to distinguish "programming in the small" from "programming in the large". I think Philip J. Guo put it well (Philip Guo -- Hey, your Python code is unreadable!):

I argue that code written in Python is not necessarily any more easily readable and comprehensible than code written in other contemporary languages, even though one of Python's major selling points is its concise, clear, and consistent syntax. While Python code might be easier to comprehend 'in-the-small', it is not usually any easier to comprehend 'in-the-large'.

Moreover, Python programs often suffer from abuse of OO (sometimes to a horrible level), leading to OO-spaghetti and programs several times more complex than they should be.

Python programs can be decomposed into modules, statements, expressions, and objects, as follows:

  1. Programs are composed of modules.
  2. Modules contain statements.
  3. Statements contain expressions.
  4. Expressions create and process objects.
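This four-level decomposition can be seen in even a trivial script (a minimal sketch):

```python
# A module (this file) contains statements; statements contain
# expressions; expressions create and process objects.
import math                   # statement bringing in another module

radius = 2.0                  # the literal 2.0 creates a float object
area = math.pi * radius ** 2  # expression creating a new float object
print(round(area, 2))         # prints 12.57
```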

The first versions of Python did not introduce any new ideas and by and large were just a cleaner rehash of Perl's ideas with the addition of the module concept from Modula-3. It was released in 1991, the same year as Linux. Wikipedia has a short article about Python history: History of Python - Wikipedia, the free encyclopedia

But starting with version 2.2 it added support for semi-coroutines in a platform-independent way (via generators, a concept inspired by Icon) and became a class of its own. I think that this makes Python in certain areas a better scripting language than other members of the "most popular troika" (Perl, PHP and JavaScript). The availability of semi-coroutines favorably distinguishes Python from Perl.
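Generators are what make these semi-coroutines possible: a function containing yield suspends at each yield and resumes where it left off, preserving its local state between calls. A minimal example:

```python
def fibonacci(limit):
    # generator: each 'yield' suspends the function, preserving local state
    a, b = 0, 1
    while a < limit:
        yield a          # execution resumes here on the next iteration
        a, b = b, a + b

print(list(fibonacci(50)))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```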

Python was lucky that for some time it enjoyed the full support of Google (which still employs Python's creator, Guido van Rossum). In addition, Microsoft also supported it indirectly (via IronPython and support in Visual Studio and other tools like Expression Web). That created some money flow for development and put Python in a much better position than Perl, which, after it lost O'Reilly, does not have a powerful corporate sponsor, and whose development is lingering in obscurity.

Even in 2017 Python still enjoys some level of support from Google, and there is no similar sponsor for either Ruby, R, or Perl. Of modern languages only JavaScript survived a semi-abandoned status (after the collapse of Netscape), but it paid a heavy price both in terms of speed of development of the language and popularity :-(.

One of the main reasons that Python is so popular is the so-called "first language effect": the language that universities teach students as their first language has a tremendous advantage over alternatives.

Python was adopted as the first language for teaching programming at many universities, and that produces a steady stream of language users and some enthusiasts.

The Python interpreter has an interactive mode, suitable for small examples, and it is more or less forgiving (no obligatory semicolon at the end of statements as in Perl, no C-style curly-bracket-delimited blocks, which are a source of a lot of grief for beginners, as a missing curly bracket is difficult to locate). Also, it has a more or less regular lexical structure and simple syntax (due to its very complex lexical structure and syntax, Perl is horrible as a beginner language, although the semantics of Perl are better and more understandable for beginners than the semantics of Python). And after the language found its niche in intro university courses, the proverb "nothing succeeds like success" is fully applicable.

Overcomplexity

Despite claims that Python adheres to simplicity, this is simply not true. It is a large non-orthogonal language, not that different in this respect from Perl, just with a different set of warts. A decent textbook such as Learning Python, 5th Edition by Mark Lutz (2013) is over a thousand pages. A modest intro like Introducing Python: Modern Computing in Simple Packages by Bill Lubanovic is 484 pages. A slightly more expanded intro, Python Essential Reference (4th Edition) by David Beazley, is over 700 pages. The O'Reilly cookbook is 706 pages. And so on and so forth. Humans can't learn such a large language and need to operate with a subset.

Both Perl and Python belong to the class of languages with an attitude of "whatever gets the job done", although Python pretends (just pretends) to follow the KISS principle: in words (but not in practice) Python's designers seem to prefer simplicity and consistency in design to the flexibility that Perl advocates.

2.7 vs. 3.6 problem

There is no single Python language. There are two dialects, often called 3.x and 2.x.

Currently Python 2.7 is the dominant version of Python for scientific and engineering computing (although the standard version that comes with RHEL 6.x is still Python 2.6). The 64-bit version is dominant. But the 3.x version is promoted by new books and gains in popularity too. It is now taught in universities, which instantly pushed it into the mainstream. The infatuation with Java in US universities ended, and ended for good, because there is nothing interesting in Java -- it is one step forward and two steps back, a kind of Cobol for the XXI century.

Python 3 has better support for coroutines (here is a quote from Fluent Python, chapter 16):

The infrastructure for coroutines appeared in PEP 342 — Coroutines via Enhanced Generators, implemented in Python 2.5 (2006): since then, the yield keyword can be used in an expression, and the .send(value) method was added to the generator API. Using .send(…), the caller of the generator can post data that then becomes the value of the yield expression inside the generator function. This allows a generator to be used as a coroutine: a procedure that collaborates with the caller, yielding and receiving values from the caller.

In addition to .send(…), PEP 342 also added .throw(…) and .close() methods that respectively allow the caller to throw an exception to be handled inside the generator, and to terminate it. These features are covered in the next section and in “Coroutine Termination and Exception Handling”.

The latest evolutionary step for coroutines came with PEP 380 - Syntax for Delegating to a Subgenerator, implemented in Python 3.3 (2012). PEP 380 made two syntax changes to generator functions, to make them more useful as coroutines:

These latest changes will be addressed in “Returning a Value from a Coroutine” and “Using yield from”.

Let’s follow the established tradition of Fluent Python and start with some very basic facts and examples, then move into increasingly mind-bending features.
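The generator-as-coroutine protocol described in the quote above (priming with next(), then .send() and .close()) can be illustrated with a running-average coroutine, a classic example in the spirit of Fluent Python:

```python
def averager():
    # coroutine: accumulates values the caller posts via .send()
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average  # suspends here; .send(v) resumes with v
        total += value
        count += 1
        average = total / count

coro = averager()
next(coro)             # prime the coroutine: advance to the first yield
print(coro.send(10))   # 10.0
print(coro.send(30))   # 20.0
coro.close()           # terminate (raises GeneratorExit inside the generator)
```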
 

Modules and OO

The key Python feature is the ability to use modules. In this sense it is a derivative of Modula-3. OO features are bolted on top of this.

While Python provides OO features, like C++ it can be used without them. They were added to the language, not present from the very beginning. And like in many other languages with OO features, they became fashionable and promoted the language. OO features are also badly abused in Python scripts -- I saw many cases when Linux/Unix maintenance scripts were written using OO features, which makes them less maintainable and the code more verbose.

While pointers to memory structures (aka objects) are how OO is implemented, unlike Perl, Python does not provide pointers as a separate data type. You can use pointers via the object-oriented framework, but generally this is a perversion. In a decent language pointers should be present as a separate data type.
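What Python does provide, implicitly, is references: every variable is a reference to an object, and assignment never copies. A short sketch of the resulting aliasing behavior:

```python
a = [1, 2, 3]
b = a              # b now references the same list object as a
b.append(4)
print(a)           # [1, 2, 3, 4] -- the change is visible through a
print(a is b)      # True: identical object identity

c = list(a)        # an explicit shallow copy yields a distinct object
print(a is c)      # False
print(a == c)      # True: equal contents, different objects
```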

Python is shipped with all versions of Linux, but not other Unix flavors

Currently Python is shipped as a standard component only with Linux and FreeBSD. Neither Solaris, nor AIX or HP-UX include Python by default (but they do include Perl).

Quantity and quality of library of available modules

By the number and quality of available modules, Python is now competitive with, and in some areas exceeds, Perl with its famous CPAN. Like Perl, Python also has a large library of standard modules shipped with the interpreter, including modules for OS interaction, regular expressions, JSON, and networking.
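A few of these standard modules in action (the module names are part of the standard library; the specific calls are only illustrative):

```python
# Standard modules shipped with every Python interpreter
import os, re, json

print(os.name)                            # e.g. 'posix' on Unix-like systems
print(re.findall(r'\d+', 'eth0 eth1'))    # ['0', '1']
print(json.dumps({'host': 'localhost'}))  # '{"host": "localhost"}'
```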

But Python also faces competition from more modern languages such as Ruby and R. Although still less popular, Ruby competes with Python, especially among programmers who value the coroutine paradigm of software development (and it is really a paradigm, not just a language feature). Python provides generators that are close enough, but still...

Ecosystem

Currently Python has the most developed ecosystem among all scripting languages, with a dozen high-quality IDEs available (Perl has almost none, although Komodo Edit can be used as a proxy). PyCharm is probably the most popular (and it has a free version for individual developers).

Starting from version 2.7 the debugger is decent and supports almost the same set of operations as the famous Perl debugger. In other words, at last it managed to bridge the gap with the Perl debugger. Better late than never ;-).

There are probably more books published about Python than about any other scripting language. In 2017 books about Python are still being baked like there is no tomorrow, but most of them are of poor or average quality, especially those written by OO zealots. This amount of "junk books" is an interesting and pretty unique feature of the language.

Many Python books should be avoided at all costs, as they do not explain the language but obscure it. You are warned.

One sign of the popularity of a scripting language is the availability of editors which use it as a macro language. Here Python outshines Perl by several orders of magnitude.

see PythonEditors - Python Wiki

Komodo is a high-quality middleweight free editor that supports writing macros in Python. See Macro API Reference.

Python got a foothold in numeric computing

Python also got a foothold in numeric computing via SciPy/NumPy. It is now widely used in the molecular modeling area, which was previously dominated by compiled languages (with Fortran being the major one). In genomic applications it also managed to partially displace Perl (although the quality of regular expression integration into the language is much lower in Python), but now R is pushing Python out of this domain.

Indentation as a proxy for nesting -- questionable design decision

Python imitates FORTRAN IV -- it uses indentation to denote nesting, similarly to how FORTRAN IV used column position to distinguish between labels and statements ;-).

That creates problems if tabs are used, as the editor and the Python interpreter might have different settings for tabs, and that can screw up the nesting of statements. This also creates problems with diffs and applying patches with patch.

Multiline statements in Python are either detected by unbalanced brackets or can be explicitly marked with a backslash. The ability to close multiple blocks by just changing indentation is a plus only for short programs visible on one screen. At the same time, the possibility of mixing tabs and blanks for indentation is a horror show. You need to specifically check whether your Python program accidentally contains tabs and convert them to blanks.
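Both continuation mechanisms in one short sketch:

```python
# Implicit continuation: a statement runs on until brackets balance
total = sum([1, 2, 3,
             4, 5, 6])

# Explicit continuation with a trailing backslash (legal but fragile:
# invisible whitespace after the backslash is a syntax error)
result = 1 + 2 + \
         3

print(total, result)  # 21 6
```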

By relegating block brackets to the lexical level of blank space and comments, Python failed to make a necessary adjustment and include a pretty printer in the interpreter (with the possibility of creating a pretty-printed program from pseudo-comments). Such a pretty printer actually needs to understand two things: the format of comments that suggest indenting, like

#{ <label>

#} <label>

and the current number of spaces in a tab, like pragma tab = 4. The interesting possibility is that in a pretty-printed program those comments can be removed and, after reading the pretty-printed program into the editor, reinstated automatically. Such a feature can be implemented in any scriptable editor.
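Such a tool is easy to prototype. The sketch below is hypothetical -- the '#{' / '#}' comment format and the tab width are the assumptions proposed above, not any standard -- and it simply rebuilds indentation from the block markers:

```python
# Hypothetical pretty-printer sketch: rebuilds indentation from
# '#{' / '#}' block comments (an assumed convention, not a standard).
TAB = 4  # assumed tab width, per the 'pragma tab = 4' idea above

def reindent(lines):
    depth = 0
    out = []
    for line in lines:
        stripped = line.strip()
        if stripped.startswith('#}'):
            depth -= 1                         # closing marker: dedent first
        out.append(' ' * (TAB * depth) + stripped)
        if stripped.startswith('#{'):
            depth += 1                         # opening marker: indent what follows
    return out

src = ['if x > 0:', '#{ positive', 'print(x)', '#} positive']
print('\n'.join(reindent(src)))
```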

My impression is that few people understand that the C solution for blocks ({ } blocks) was pretty weak in comparison with its prototype language (PL/1): it does not permit a nice labeled closing of blocks like

A:do ... end A;

in PL/1. IMHO the introduction of a pretty printer as a standard feature of both a compiler and a GUI environment is long overdue, and here Python can make its mark.

By adopting indentation as a proxy for nesting, Python actually encourages a programmer to use a decent editor, but we knew that already, right? This design decision also narrows down the possible range of coding styles and automatically leads to more compact (by the number of lexical tokens) programs (deletion of curly brackets usually helps lessen the number of lines in a C or Perl program by 20% or more).

Difficulties of adapting to the language for Perl programmers

Although Python as a scripting language used Perl as a prototype and its features are generally similar to Perl's, Perl programmers experience difficulties adapting to the language. The two are not overly similar in implementation details, nor even remotely similar in syntax. Their extension mechanisms also took different directions.

Final notes

We all understand that in real life the better language seldom wins (look at Java). Luck plays a tremendous role in determining a language's popularity. The best commercially supported language that satisfies current fashion has better chances. Python managed to ride the wave of enthusiasm for OO programming, which (by and large) unfairly relegated Perl to second-class status. And it is not a bad scripting language, so in a way the success of Python is our success too.

Python now also has several different implementations of the interpreter, which is a clear sign of both the popularity and the maturity of the language. Along with the CPython interpreter (which is standard) there is the quite popular Jython, which uses the JVM and thus integrates well with Java, and IronPython, which is a Microsoft implementation (Python -- programming language):

The mainstream Python implementation, also known as CPython, is written in C compliant to the C89 standard, and is distributed with a large standard library written in a mixture of C and Python. CPython ships for a large number of supported platforms, including Microsoft Windows and most modern Unix-like systems. CPython was intended from almost its very conception to be cross-platform; its use and development on esoteric platforms such as Amoeba alongside more conventional ones like Unix or Macintosh has greatly helped in this regard.

Stackless Python is a significant fork of CPython that implements microthreads. It can be expected to run on approximately the same platforms that CPython runs on.

There are two other major implementations: Jython for the Java platform, and IronPython for the .NET platform. PyPy is an experimental self-hosting implementation of Python, in Python, that can output a variety of types of bytecode, object code and intermediate languages.

Several programs exist to package Python programs into standalone executables, including py2exe, PyInstaller, cx_Freeze and py2app.

Many Python programs can run on different Python implementations, on such disparate operating systems and execution environments, without change. In the case of the implementations running on top of the Java virtual machine or the Common Language Runtime, the platform-independence of these systems is harnessed by their respective Python implementation.

Many third-party libraries for Python (and even some first-party ones) are only available on Windows, Linux, BSD, and Mac OS X.

There is also a dialect called Stackless Python which adds support for coroutines, communication channels and task serialization.

Python also has a better interface with C programs than Perl, which allows writing extension modules in C.

Nikolai Bezroukov




Old News ;-)

[Jun 14, 2019] The Twilight of Equality Neoliberalism, Cultural Politics, and the Attack on Democracy by Lisa Duggan

Notable quotes:
"... For example, she discusses neoliberal attempts to be "multicultural," but points out that economic resources are constantly redistributed upward. Neoliberal politics, she argues, has only reinforced and increased the divide between economic and social political issues. ..."
"... Because neoliberal politicians wish to save neoliberalism by reforming it, she argues that proposing alternate visions and ideas have been blocked. ..."
Jun 14, 2019 | www.amazon.com

S. Baker 5.0 out of 5 stars Summary/Review of Twilight of Equality November 27, 2007

Duggan articulately connects social and economic issues to each other, arguing that neoliberal politics have divided the two when in actuality, they cannot be separated from one another.

In the introduction, Duggan argues that politics have become neoliberal - while politics operate under the guise of promoting social change or social stability, in reality, she argues, politicians have failed to make the connection between economic and social/cultural issues. She uses historical background to prove the claim that economic and social issues can be separated from each other is false.

For example, she discusses neoliberal attempts to be "multicultural," but points out that economic resources are constantly redistributed upward. Neoliberal politics, she argues, has only reinforced and increased the divide between economic and social political issues.

After the introduction, Duggan focuses on a specific topic in each chapter: downsizing democracy, the incredible shrinking public, equality, and love and money. In the first chapter (downsizing democracy), she argues that through violent imperial assertion in the Middle East, budget cuts in social services, and disillusionments in political divides, "capitalists could actually bring down capitalism" (p. 2).

Because neoliberal politicians wish to save neoliberalism by reforming it, she argues that proposing alternate visions and ideas have been blocked. Duggan provides historical background that help the reader connect early nineteenth century U.S. legislation (regarding voting rights and slavery) to perpetuated institutional prejudices.

[Jun 14, 2019] Mean Girl Ayn Rand and the Culture of Greed by Lisa Duggan

Notable quotes:
"... From the 1980s to 2008, neoliberal politics and policies succeeded in expanding inequality around the world. The political climate Ayn Rand celebrated—the reign of brutal capitalism—intensified. Though Ayn Rand’s popularity took off in the 1940s, her reputation took a dive during the 1960s and ’70s. Then after her death in 1982, during the neoliberal administrations of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom, her star rose once more. (See chapter 4 for a full discussion of the rise of neoliberalism.) ..."
"... During the global economic crisis of 2008 it seemed that the neoliberal order might collapse. It lived on, however, in zombie form as discredited political policies and financial practices were restored. ..."
"... We are in the midst of a major global, political, economic, social, and cultural transition — but we don’t yet know which way we’re headed. The incoherence of the Trump administration is symptomatic of the confusion as politicians and business elites jockey with the Breitbart alt-right forces while conservative evangelical Christians pull strings. The unifying threads are meanness and greed, and the spirit of the whole hodgepodge is Ayn Rand. ..."
"... The current Trump administration is stuffed to the gills with Rand acolytes. Trump himself identifies with Fountainhead character Howard Roark; former secretary of state Rex Tillerson listed Atlas Shrugged as his favorite book in a Scouting magazine feature; his replacement Mike Pompeo has been inspired by Rand since his youth. Ayn Rand’s influence is ascendant across broad swaths of our dominant political culture — including among public figures who see her as a key to the Zeitgeist, without having read a word of her writing. ..."
"... Rand biographer Jennifer Burns asserts simply that Ayn Rand's fiction is “the gateway drug” to right-wing politics in the United States — although her influence extends well beyond the right wing ..."
"... The resulting Randian sense of life might be called “optimistic cruelty.” Optimistic cruelty is the sense of life for the age of greed. ..."
"... The Fountainhead and especially Atlas Shrugged fabricate history and romanticize violence and domination in ways that reflect, reshape, and reproduce narratives of European superiority and American virtue. ..."
"... It is not an accident that the novels’ fans, though gender mixed, are overwhelmingly white Americans of the professional, managerial, creative, and business classes." ..."
"... Does the pervasive cruelty of today's ruling classes shock you? Or, at least give you pause from time to time? Are you surprised by the fact that our elected leaders seem to despise people who struggle, people whose lives are not cushioned and shaped by inherited wealth, people who must work hard at many jobs in order to scrape by? If these or any of a number of other questions about the social proclivities of our contemporary ruling class detain you for just two seconds, this is the book for you. ..."
"... As Duggan makes clear, Rand's influence is not just that she offered a programmatic for unregulated capitalism, but that she offered an emotional template for "optimistic cruelty" that has extended far beyond its libertarian confines. Mean Girl is a fun, worthwhile read! ..."
"... Her work circulated endlessly in those circles of the Goldwater-ite right. I have changed over many years, and my own life experiences have led me to reject the casual cruelty and vicious supremacist bent of Rand's beliefs. ..."
"... In fact, though her views are deeply-seated, Rand is, at heart, a confidence artist, appealing only to narrow self-interest at the expense of the well-being of whole societies. ..."
Jun 14, 2019 | www.amazon.com

From the Introduction

... ... ...

Mean Girls, which was based on interviews with high school girls conducted by Rosalind Wiseman for her 2002 book Queen Bees and Wannabes, reflects the emotional atmosphere of the age of the Plastics (as the most popular girls at fictional North Shore High are called), as well as the era of Wall Street's Gordon Gekko, whose motto is “Greed is Good.”1 The culture of greed is the hallmark of the neoliberal era, the period beginning in the 1970s when the protections of the U.S. and European welfare states, and the autonomy of postcolonial states around the world, came under attack. Advocates of neoliberalism worked to reshape global capitalism by freeing transnational corporations from restrictive forms of state regulation, stripping away government efforts to redistribute wealth and provide public services, and emphasizing individual responsibility over social concern.

From the 1980s to 2008, neoliberal politics and policies succeeded in expanding inequality around the world. The political climate Ayn Rand celebrated—the reign of brutal capitalism—intensified. Though Ayn Rand’s popularity took off in the 1940s, her reputation took a dive during the 1960s and ’70s. Then after her death in 1982, during the neoliberal administrations of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom, her star rose once more. (See chapter 4 for a full discussion of the rise of neoliberalism.)

During the global economic crisis of 2008 it seemed that the neoliberal order might collapse. It lived on, however, in zombie form as discredited political policies and financial practices were restored. But neoliberal capitalism has always been contested, and competing and conflicting political ideas and organizations proliferated and intensified after 2008 as well.

Protest politics blossomed on the left with Occupy Wall Street, Black Lives Matter, and opposition to the Dakota Access oil pipeline at the Standing Rock Sioux reservation in the United States, and with the Arab Spring, and other mobilizations around the world. Anti-neoliberal electoral efforts, like the Bernie Sanders campaign for the U.S. presidency, generated excitement as well.

But protest and organizing also expanded on the political right, with reactionary populist, racial nationalist, and protofascist gains in such countries as India, the Philippines, Russia, Hungary, and the United States rapidly proliferating. Between these far-right formations on the one side and persistent zombie neoliberalism on the other, operating sometimes at odds and sometimes in cahoots, the Season of Mean is truly upon us.

We are in the midst of a major global, political, economic, social, and cultural transition — but we don’t yet know which way we’re headed. The incoherence of the Trump administration is symptomatic of the confusion as politicians and business elites jockey with the Breitbart alt-right forces while conservative evangelical Christians pull strings. The unifying threads are meanness and greed, and the spirit of the whole hodgepodge is Ayn Rand.

Rand’s ideas are not the key to her influence. Her writing does support the corrosive capitalism at the heart of neoliberalism, though few movers and shakers actually read any of her nonfiction. Her two blockbuster novels, The Fountainhead and Atlas Shrugged, are at the heart of her incalculable impact. Many politicians and government officials going back decades have cited Rand as a formative influence—particularly finance guru and former Federal Reserve chairman Alan Greenspan, who was a member of Rand's inner circle, and Ronald Reagan, the U.S. president most identified with the national embrace of neoliberal policies.

Major figures in business and finance are or have been Rand fans: Jimmy Wales (Wikipedia), Peter Thiel (PayPal), Steve Jobs (Apple), John Mackey (Whole Foods), Mark Cuban (NBA), John Allison (BB&T Banking Corporation), Travis Kalanick (Uber), Jeff Bezos (Amazon), ad infinitum.

There are also large clusters of enthusiasts for Rand’s novels in the entertainment industry, from the 1940s to the present—from Barbara Stanwyck, Joan Crawford, and Raquel Welch to Jerry Lewis, Brad Pitt, Angelina Jolie, Rob Lowe, Jim Carrey, Sandra Bullock, Sharon Stone, Ashley Judd, Eva Mendes, and many more.

The current Trump administration is stuffed to the gills with Rand acolytes. Trump himself identifies with Fountainhead character Howard Roark; former secretary of state Rex Tillerson listed Atlas Shrugged as his favorite book in a Scouting magazine feature; his replacement Mike Pompeo has been inspired by Rand since his youth. Ayn Rand’s influence is ascendant across broad swaths of our dominant political culture — including among public figures who see her as a key to the Zeitgeist, without having read a word of her writing.

But beyond the famous or powerful fans, the novels have had a wide popular impact as bestsellers since publication. Along with Rand’s nonfiction, they form the core texts for a political/philosophical movement: Objectivism. There are several U.S.-based Objectivist organizations and innumerable clubs, reading groups, and social circles. A 1991 survey by the Library of Congress and the Book of the Month Club found that only the Bible had influenced readers more than Atlas Shrugged, while a 1998 Modern Library poll listed The Fountainhead and Atlas Shrugged as the two most revered novels in English.

Atlas Shrugged in particular skyrocketed in popularity in the wake of the 2008 financial crash. The U.S. Tea Party movement, founded in 2009, featured numerous Ayn Rand-based signs and slogans, especially the opening line of Atlas Shrugged: “Who is John Galt?” Republican pundit David Frum claimed that the Tea Party was reinventing the GOP as “the party of Ayn Rand.” During 2009 as well, sales of Atlas Shrugged tripled, and GQ magazine called Rand the year’s most influential author. A 2010 Zogby poll found that 29 percent of respondents had read Atlas Shrugged, and half of those readers said it had affected their political and ethical thinking.

In 2018, a business school teacher writing in Forbes magazine recommended repeat readings: “Recent events — the bizarro circus that is the 2016 election, the disintegration of Venezuela, and so on — make me wonder if a lot of this could have been avoided had we taken Atlas Shrugged's message to heart. It is a book that is worth re-reading every few years.”3

Rand biographer Jennifer Burns asserts simply that Ayn Rand's fiction is “the gateway drug” to right-wing politics in the United States — although her influence extends well beyond the right wing.4

But how can the work of this one novelist (also an essayist, playwright, and philosopher), however influential, be a significant source of insight into the rise of a culture of greed? In a word: sex. Ayn Rand made acquisitive capitalists sexy. She launched thousands of teenage libidos into the world of reactionary politics on a wave of quivering excitement. This sexiness extends beyond romance to infuse the creative aspirations, inventiveness, and determination of her heroes with erotic energy, embedded in what Rand called her “sense of life.” Analogous to what Raymond Williams has called a “structure of feeling,” Rand’s sense of life combines the libido-infused desire for heroic individual achievement with contempt for social inferiors and indifference to their plight.5

Lauren Berlant has called the structure of feeling, or emotional situation, of those who struggle for a good life under neoliberal conditions "cruel optimism" — the complex of feelings necessary to keep plugging away hopefully despite setbacks and losses. Rand's contrasting sense of life applies to those whose fantasies of success and domination include no doubt or guilt. The feelings of aspiration and glee that enliven Rand's novels combine with contempt for and indifference to others. The resulting Randian sense of life might be called "optimistic cruelty." Optimistic cruelty is the sense of life for the age of greed.

Ayn Rand's optimistic cruelty appeals broadly and deeply through its circulation of familiar narratives: the story of "civilizational" progress, the belief in American exceptionalism, and a commitment to capitalist freedom.

Her novels engage fantasies of European imperial domination conceived as technological and cultural advancement, rather than as violent conquest. America is imagined as a clean slate for pure capitalist freedom, with no indigenous people, no slaves, no exploited immigrants or workers in sight. The Fountainhead and especially Atlas Shrugged fabricate history and romanticize violence and domination in ways that reflect, reshape, and reproduce narratives of European superiority and American virtue.

Their logic also depends on a hierarchy of value based on racialized beauty and physical capacity: perceived ugliness or disability is equated with pronounced worthlessness and incompetence.

Through the forms of romance and melodrama, Rand's novels extrapolate the story of racial capitalism as a story of righteous passion and noble virtue. They retell The Birth of a Nation through the lens of industrial capitalism (see chapter 2). They solicit positive identification with winners, with dominant historical forces. It is not an accident that the novels' fans, though gender mixed, are overwhelmingly white Americans of the professional, managerial, creative, and business classes.


aslan, June 1, 2019

devastating account of the ethos that shapes contemporary America

Ayn Rand is a singular influence on American political thought, and this book brilliantly unfolds how Rand gave voice to the ethos that shapes contemporary conservatism. Duggan -- whose equally insightful earlier book Twilight of Equality offered an analysis of neoliberalism and showed how it is both a distortion and continuation of classical liberalism -- here extends the analysis of American market mania by showing how an anti-welfare state ethos took root as a "structure of feeling" in American culture, elevating the individual over the collective and promoting a culture of inequality as itself a moral virtue.

Although reviled by the right-wing press (she should wear this as a badge of honor), Duggan is the most astute guide one could hope for through this devastating history of our recent past, and the book helps explain how we ended up where we are, where far-right, racist nationalism colludes (paradoxically) with libertarianism, an ideology of extreme individualism, and (unlikely bedfellows, one might have thought) Silicon Valley entrepreneurship.

This short, accessible book is essential reading for everyone who wants to understand the contemporary United States.

Wreck2, June 1, 2019
contemporary cruelty

Does the pervasive cruelty of today's ruling classes shock you? Or, at least give you pause from time to time? Are you surprised by the fact that our elected leaders seem to despise people who struggle, people whose lives are not cushioned and shaped by inherited wealth, people who must work hard at many jobs in order to scrape by? If these or any of a number of other questions about the social proclivities of our contemporary ruling class detain you for just two seconds, this is the book for you.

Writing with wit, rigor, and vigor, Lisa Duggan explains how Ayn Rand, the "mean girl," has captured the minds and snatched the bodies of so very many, and has rendered them immune to feelings of shared humanity with those whose fortunes are not as rosy as their own. An indispensable work, a short read that leaves a long memory.

kerwynk, June 2, 2019
Valuable and insightful commentary on Rand and Rand's influence on today's world

Mean Girl offers not only a biographical account of Rand (including the fact that she modeled one of her key heroes on a serial killer), but also describes Rand's influence on neoliberal thinking more generally.

As Duggan makes clear, Rand's influence is not just that she offered a program for unregulated capitalism, but that she offered an emotional template for "optimistic cruelty" that has extended far beyond its libertarian confines. Mean Girl is a fun, worthwhile read!

Sister, June 3, 2019

Superb political and cultural exploration of Rand's influence

Lisa Duggan's concise but substantive look at the political and cultural influence of Ayn Rand is stunning. I feel like I've been waiting most of a lifetime for a book that is as wonderfully readable as it is insightful. Many who write about Rand reduce her to a caricature hero or demon without taking her, and the history and choices that produced her, seriously as a subject of cultural inquiry. I am one of those people who first encountered Rand's books – novels, but also some nonfiction and her play, "The Night of January 16th," in which audience members were selected as jurors – as a teenager.

Under the thrall of some right-wing locals, I was so drawn to Rand's larger-than-life themes, the crude polarization of "individualism" and "conformity," the admonition to selfishness as a moral virtue, her reductive dismissal of the public good as "collectivism."

Her work circulated endlessly in those circles of the Goldwater-ite right. I have changed over many years, and my own life experiences have led me to reject the casual cruelty and vicious supremacist bent of Rand's beliefs.

But over those many years, the coterie of Rand true believers has kept the faith and expanded. One of the things I value about Duggan's compelling account is her willingness to take seriously the far reach of Rand's indifference to human suffering even as she strips away the veneer that suggests Rand's beliefs were deep.

In fact, though her views are deeply-seated, Rand is, at heart, a confidence artist, appealing only to narrow self-interest at the expense of the well-being of whole societies.

I learned that the hard way, but I learned it. Now I am recommending Duggan's wise book to others who seek to understand today's cultural and political moment in the United States and the rise of an ethic of indifference to anybody but the already affluent. Duggan is comfortable with complexity; most Randian champions or detractors are not.

[Jun 11, 2019] How to Hide an Empire: A History of the Greater United States by Daniel Immerwahr

Notable quotes:
"... No other book out there has the level of breadth on the history of US imperialism that this work provides. Even though it packs 400 pages of text (which might seem like a turnoff for non-academic readers), "How to Hide an Empire" is highly readable given Immerwahr's skills as a writer. Also, its length is part of what makes it awesome because it gives it the right amount of detail and scope. ..."
"... Alleging that US imperialism in its long evolution (which this book deciphers with poignancy) has had no bearing on the destinies of its once conquered populations is as fallacious as saying that the US is to blame for every single thing that happens in Native American communities, or in the Philippines, Puerto Rico, Guam, American Samoa, etc. Not everything that happens in these locations and among these populations is directly connected to US expansionism, but a great deal is. ..."
"... This is exactly the kind of book that drives the "My country, right or wrong" crowd crazy. Yes, slavery and genocide and ghastly scientific experiments existed before Europeans colonized the Americas, but it's also fair and accurate to say that Europeans made those forms of destruction into a bloody artform. Nobody did mass slaughter better. ..."
Feb 19, 2019 | www.amazon.com
4.6 out of 5 stars, 50 customer reviews

Jose I. Fuste, February 25, 2019

5.0 out of 5 stars Comprehensive yet highly readable. A necessary and highly useful update.

I'm a professor at the University of California San Diego and I'm assigning this for a graduate class.

No other book out there has the level of breadth on the history of US imperialism that this work provides. Even though it packs 400 pages of text (which might seem like a turnoff for non-academic readers), "How to Hide an Empire" is highly readable given Immerwahr's skills as a writer. Also, its length is part of what makes it awesome because it gives it the right amount of detail and scope.

I could not disagree more with the person who gave this book one star. Take it from me: I've taught hundreds of college students who graduate among the best in their high school classes and they know close to nothing about the history of US settler colonialism, overseas imperialism, or US interventionism around the world. If you give University of California college students a quiz on where the US' overseas territories are, most who take it will fail (trust me, I've done it). And this is not their fault. Instead, it's a product of the US education system that fails to give students a nuanced and geographically comprehensive understanding of the oversized effect that their country has around our planet.

Alleging that US imperialism in its long evolution (which this book deciphers with poignancy) has had no bearing on the destinies of its once conquered populations is as fallacious as saying that the US is to blame for every single thing that happens in Native American communities, or in the Philippines, Puerto Rico, Guam, American Samoa, etc. Not everything that happens in these locations and among these populations is directly connected to US expansionism, but a great deal is.

A case in point is Puerto Rico's current fiscal and economic crisis. The island's political class share part of the blame for Puerto Rico's present rut. A lot of it is also due to unnatural (i.e. "natural" but human-exacerbated) disasters such as Hurricane María. However, there is no denying that the evolution of Puerto Rico's territorial status has generated a host of adverse economic conditions that US states (including an island state such as Hawaii) do not have to contend with. An association with the US has undoubtedly raised the floor of material conditions in these places, but it has also imposed an unjust glass ceiling that most people around the US either do not know about or continue to ignore.

To add to those unfair economic limitations, there are political injustices regarding the lack of representation in Congress, and in the case of Am. Samoa, their lack of US citizenship. The fact that the populations in the overseas territories can't make up their mind about what status they prefer is: a) understandable given the way they have been mistreated by the US government, and b) irrelevant because what really matters is what Congress decides to do with the US' far-flung colonies, and there is no indication that Congress wants to either fully annex them or let them go because neither would be convenient to the 50 states and the political parties that run them. Instead, the status quo of modern colonial indeterminacy is what works best for the most potent political and economic groups in the US mainland.

This book is about much more than that, though. It's also a history of how and why the United States got to control so much of what happens around the world without creating additional formal colonies like the "territories" that exist in this legal limbo. Part of its goal is to show precisely how US imperialism has been made more cost-effective and also more invisible.

Read Immerwahr's book, and don't listen to the apologists of US imperialism, which is still an active force that contradicts the US' professed values and that needs to be actively dismantled. Their attempts at discrediting this important work reflect a denialism of the US' imperial realities that has endured throughout the history that this book summarizes.

"How to Hide an Empire: A History of the Greater United States" is a great starting point for making the US public aware of the US' contradictions as an "empire of liberty" (a phrase once used by Thomas Jefferson to describe the US as it expanded westward beyond the original 13 colonies). It is also a necessary update to other books on this topic that are already out there, and it is likely to hold the reader's attention more given its crafty narrative prose and structure.

David Robson, February 26, 2019
5.0 out of 5 stars Why So Sensitive?

This is exactly the kind of book that drives the "My country, right or wrong" crowd crazy. Yes, slavery and genocide and ghastly scientific experiments existed before Europeans colonized the Americas, but it's also fair and accurate to say that Europeans made those forms of destruction into a bloody artform. Nobody did mass slaughter better.

The author of this compelling book reveals a history unknown to many readers, and does so with first-hand accounts and deep historical analyses. You might ask why we can't put such things behind us. The simple answer: we've never fully grappled with these events before in an honest and open way. This book does the nation a service by peering behind the curtain and facing the sobering truth of how we came to be what we are.

Thomas W. Moloney, April 9, 2019
5.0 out of 5 stars This is a stunning book, not to be missed.

This is a stunning book, not to be missed. If you finished Sapiens with the feeling your world view had greatly enlarged, you're likely to have the same experience of your view of the US from reading this engaging work. And like Sapiens, it's an entirely enjoyable read, full of delightful surprises, future dinner party gems.

The further you get into the book the more interesting and unexpected it becomes. You'll look at the US in ways you likely never considered before. This is not a 'political' book with an ax to grind or a single-party agenda. It's refreshingly insightful, beautifully written, fun to read.

This is a gift I'll give to many a good friend; I've just started with my wife. I rarely write reviews and have never met the author (now my only regret).

P, May 17, 2019

4.0 out of 5 stars Content is A+. Never gets boring/tedious; never lingers; well written. It is perfect. 10/10

This book is an absolute powerhouse, a must-read, and should be a part of every student's curriculum in this God-forsaken country.

Strictly speaking, this brilliant read is focused on America's relationship with Empire. But like with nearly everything America, one cannot discuss it without discussing race and injustice.

If you read this book, you will learn a lot of new things about subjects that you thought you knew everything about. You will have your eyes opened. You will be exposed to the dark underbelly of racism, corruption, greed and exploitation that undergird American ambition.

I don't know exactly what else to say other than to say you MUST READ THIS BOOK. This isn't a partisan statement -- it's not like Democrats are any better than Republicans in this book.

This is one of the best books I've ever read, and I am a voracious reader. The content is A+. It never gets boring. It never gets tedious. It never lingers on narratives. It's extremely well written. It is, in short, perfect. And as such, 10/10.

Sunny, May 11, 2019

5.0 out of 5 stars Excellent and thoughtful discussion regarding the state of our union

I heard an interview of Daniel Immerwahr on NPR news / WDET radio regarding this book.

I am quite conservative and only listen to NPR news when it doesn't lean too far to the left.

However, the interview piqued my interest. I am so glad I purchased this ebook. What a phenomenal and informative read!!! WOW!! It's a "I never knew that" kind of read. Certainly not anything I was taught in school. This is thoughtful, well written and an easy read. Highly recommend!!

[Jun 11, 2019] Globalists: The End of Empire and the Birth of Neoliberalism by Quinn Slobodian

The author, in a very fuzzy way, comes to the idea that neoliberalism is in essence a Trotskyism for the rich, and that neoliberals want to use a strong state to enforce from above the type of markets they want. That includes free movement of capital, goods, and people across national borders. All the talk about "small government" is just a smokescreen for naive fools.
As in the 1930s, contemporary right-wing populism in Germany and Austria emerged from within neoliberalism, not in opposition to it. It essentially converts neoliberalism into "national liberalism": yes to free trade, but only on a bilateral basis with strict control of trade deficits; no to free migration and multilateralism.
Notable quotes:
"... The second explanation was that neoliberal globalization made a small number of people very rich, and it was in the interest of those people to promote a self-serving ideology using their substantial means by funding think tanks and academic departments, lobbying congress, fighting what the Heritage Foundation calls "the war of ideas." Neoliberalism, then, was a restoration of class power after the odd, anomalous interval of the mid-century welfare state. ..."
"... Neoliberal globalism can be thought of in its own terms as a negative theology, contending that the world economy is sublime and ineffable with a small number of people having special insight and ability to craft institutions that will, as I put it, encase the sublime world economy. ..."
"... One of the big goals of my book is to show neoliberalism is one form of regulation among many rather than the big Other of regulation as such. ..."
"... I build here on the work of other historians and show how the demands in the United Nations by African, Asian, and Latin American nations for things like the Permanent Sovereignty over Natural Resources, i.e. the right to nationalize foreign-owned companies, often dismissed as merely rhetorical, were actually existentially frightening to global businesspeople. ..."
"... They drafted neoliberal intellectuals to do things like craft agreements that gave foreign corporations more rights than domestic actors and tried to figure out how to lock in what I call the "human right of capital flight" into binding international codes. I show how we can see the development of the WTO as largely a response to the fear of a planned -- and equal -- planet that many saw in the aspirations of the decolonizing world. ..."
"... The neoliberal insight of the 1930s was that the market would not take care of itself: what Wilhelm Röpke called a market police was an ongoing need in a world where people, whether out of atavistic drives or admirable humanitarian motives, kept trying to make the earth a more equal and just place. ..."
"... The culmination of these processes by the 1990s is a world economy that is less like a laissez-faire marketplace and more like a fortress, as ever more of the world's resources and ideas are regulated through transnational legal instruments. ..."
Mar 16, 2018 | www.amazon.com

Hardcover: 400 pages
Publisher: Harvard University Press (March 16, 2018)
Language: English
ISBN-10: 0674979524
ISBN-13: 978-0674979529

From introduction

...The second explanation was that neoliberal globalization made a small number of people very rich, and it was in the interest of those people to promote a self-serving ideology using their substantial means by funding think tanks and academic departments, lobbying congress, fighting what the Heritage Foundation calls "the war of ideas." Neoliberalism, then, was a restoration of class power after the odd, anomalous interval of the mid-century welfare state.

There is truth to both of these explanations. Both presuppose a kind of materialist explanation of history with which I have no problem. In my book, though, I take another approach. What I found is that we could not understand the inner logic of something like the WTO without considering the whole history of the twentieth century. What I also discovered is that some of the members of the neoliberal movement from the 1930s onward, including Friedrich Hayek and Ludwig von Mises, did not use either of the explanations I just mentioned. They actually didn't say that economic growth excuses everything. One of the peculiar things about Hayek, in particular, is that he didn't believe in using aggregates like GDP -- the very measurements that we need to even say what growth is.

What I found is that neoliberalism as a philosophy is less a doctrine of economics than a doctrine of ordering -- of creating the institutions that provide for the reproduction of the totality [of financial elite control of the state]. At the core of the strain I describe is not the idea that we can quantify, count, price, buy and sell every last aspect of human existence. Actually, here it gets quite mystical. The Austrian and German School of neoliberals in particular believe in a kind of invisible world economy that cannot be captured in numbers and figures but always escapes human comprehension.

After all, if you can see something, you can plan it. Because of the very limits to our knowledge, we have to default to ironclad rules and not try to pursue something as radical as social justice, redistribution, or collective transformation. In a globalized world, we must give ourselves over to the forces of the market, or the whole thing will stop working.

So this is quite a different version of neoliberal thought than the one we usually have, premised on the abstract of individual liberty or the freedom to choose. Here one is free to choose but only within a limited range of options left after responding to the global forces of the market.

One of the core arguments of my book is that we can only understand the internal coherence of neoliberalism if we see it as a doctrine as concerned with the whole as the individual. Neoliberal globalism can be thought of in its own terms as a negative theology, contending that the world economy is sublime and ineffable with a small number of people having special insight and ability to craft institutions that will, as I put it, encase the sublime world economy.

To me, the metaphor of encasement makes much more sense than the usual idea of markets set free, liberated or unfettered. How can it be that in an era of proliferating third party arbitration courts, international investment law, trade treaties and regulation that we talk about "unfettered markets"? One of the big goals of my book is to show neoliberalism is one form of regulation among many rather than the big Other of regulation as such.

What I explore in Globalists is how we can think of the WTO as the latest in a long series of institutional fixes proposed for the problem of emergent nationalism and what neoliberals see as the confusion between sovereignty -- ruling a country -- and ownership -- owning the property within it.

I build here on the work of other historians and show how the demands in the United Nations by African, Asian, and Latin American nations for things like the Permanent Sovereignty over Natural Resources, i.e. the right to nationalize foreign-owned companies, often dismissed as merely rhetorical, were actually existentially frightening to global businesspeople.

They drafted neoliberal intellectuals to do things like craft agreements that gave foreign corporations more rights than domestic actors and tried to figure out how to lock in what I call the "human right of capital flight" into binding international codes. I show how we can see the development of the WTO as largely a response to the fear of a planned -- and equal -- planet that many saw in the aspirations of the decolonizing world.

Perhaps the lasting image of globalization that the book leaves is that world capitalism has produced a doubled world -- a world of imperium (the world of states) and a world of dominium (the world of property). The best way to understand neoliberal globalism as a project is that it sees its task as the never-ending maintenance of this division. The neoliberal insight of the 1930s was that the market would not take care of itself: what Wilhelm Röpke called a market police was an ongoing need in a world where people, whether out of atavistic drives or admirable humanitarian motives, kept trying to make the earth a more equal and just place.

The culmination of these processes by the 1990s is a world economy that is less like a laissez-faire marketplace and more like a fortress, as ever more of the world's resources and ideas are regulated through transnational legal instruments. The book acts as a kind of field guide to these institutions and, in the process, hopefully recasts the 20th century that produced them.


Mark bennett

3.0 out of 5 stars One half of a decent book, May 14, 2018

This is a rather interesting look at the political and economic ideas of a circle of important economists, including Hayek and von Mises, over the course of the last century. The author shows rather convincingly that conventional narratives concerning their ideas are wrong: that they didn't believe in a weak state, didn't believe in laissez-faire capitalism or in the power of the market, and that they saw mass democracy as a threat to vested economic interests.

The core belief of these people was in a world where money, labor, and products could flow across borders without any limit. Their vision was to remove these subjects (tariffs, immigration, and controls on the movement of money) from the control of the democracy-based nation-state and instead to vest them in international organizations, which were by their nature undemocratic and beyond the influence of democracy. Rather than rejecting government power as such, what they rejected was national government power: they wanted weak national governments but, at the same time, strong undemocratic international organizations that would gain the powers taken from the state.

The other thing that characterized many of these people was a rather general rejection of economics. While some of them were (at least in theory) economists, they rejected the basic ideas of economic analysis and economic policy. The economy, to them, was a mystical thing beyond any human understanding or ability to influence in a positive way. Their only real belief was in "bigness": the larger the market for labor and goods, the more economically prosperous everyone would become, with an unregulated "global" market, specialization across borders, and free migration of labor being the ultimate system.

The author shows how, over a period extending from the 1920s to the 1990s, these ideas evolved from marginal academic ideas to being dominant ideas internationally. Ideas that are reflected today in the structure of the European Union, the WTO (World Trade Organization) and the policies of most national governments. These ideas, which the author calls "neoliberalism", have today become almost assumptions beyond challenge. And even more strangely, the dominating ideas of the political left in most of the west.

The author makes the point, though in a weak way, that the "fathers" of neoliberalism saw themselves as "restoring" a lost golden age. That golden age being (roughly) the age of the original industrial revolution (the second half of the 1800s). And to the extent that they have been successful they have done that. But at the same time, they have brought back all the political and economic questions of that era as well.

In reading it, I started to wonder about the differences between modern neoliberalism and the liberal political movement during the industrial revolution. I really began to wonder about the actual motives of "reform" liberals in that era. Were they genuinely interested in reforms, or were all the reforms just cynical politics designed to enhance business power at the expense of other vested interests? Was, in particular, the liberal interest in political reform and franchise expansion a genuine move toward political democracy, or simply a temporary ploy to increase their political power? If one assumes that the true principles of classic liberalism were always free trade, free migration of labor, and removing the power of governments to impact business, perhaps its collapse around the time of the First World War is easier to understand.

He also makes a good point about the EEC and the organizations that came before the EU. Those organizations were as much about protecting trade between Europe and former European colonial possessions as they were about trade within Europe.

To me at least, the analysis of the author was rather original. In particular, he did an excellent job of showing how the ideas of Hayek and von Mises have been distorted and misunderstood in the mainstream. He was able to show what their ideas were and how they relate to contemporary problems of government and democracy.

But there are some strong negatives in the book. The author offers up a complete virtue-signaling chapter to prove that the neoliberals are racists. He brings up things, like the John Birch Society, that have nothing to do with the book. He unleashes a whole lot of venom directed at American conservatives and Republicans, mostly set against a 1960s backdrop. He does all this for a bad purpose: to claim that the Kennedy Administration was somehow a continuation of the New Deal rather than a step toward neoliberalism. His blindness and modern political partisanship, extended backward into history, do substantial damage to his argument in the book. He also spends an inordinate amount of time on the political issues of South Africa, which likewise adds nothing to the argument of the book. His whole chapter on racism is an elaborate strawman held together by Röpke. And he spends a large amount of time grinding some sort of ax with regard to the National Review and William F. Buckley.

He keeps resorting to the simple formula of finding something racist said or written by Röpke, and then inferring that anyone who quoted or had anything to do with Röpke shared his ideas and was also a racist. The whole point of the exercise seems to be to avoid any analysis of how the Democratic Party (and the political left) drifted over the decades from the politics of the New Deal to neoliberal Clintonism.

Then after that, he diverts further off the path by spending many pages on the greatness of the "global south," the G77, and the New International Economic Order (NIEO) promoted by the UN in the 1970s. And whatever the many faults of neoliberalism, Quinn Slobodian ends up standing for a worse set of ideas: international price controls, economic "reparations," nationalization, international trade subsidies, and a five-year plan for the world (socialist-style economic planning at a global level). In attaching himself to these particular ideas, he kills his own book. The premise of the book and his argument were very strong at first, but by around p. 220 it has become a throwback political tract in favor of the garbage economic and political ideas of the so-called third world circa 1974, complete with '70s-style extensive quotations from "Senegalese jurists."

Once the political agenda comes out, he just can't help himself. He opens the conclusion of the book by taking another cheap shot, for no clear reason, at William F. Buckley. He spends a lot of time on the Seattle anti-WTO protests of the 1990s. But he has NOTHING to say about Bill Clinton or Tony Blair or EU expansion or Obama, or even the 2008 economic crisis for that matter. Inexplicably for a book written in 2018, its content seems to end in the year 2000.

I'm giving it three stars for the first 150 pages, which are decent work. The second half rates zero stars. The book could have been far better if he had written his history of neoliberalism in the context of the counter-narrative of Keynesian economics and its decline. It would have been better yet if the author had had the courage to talk about the transformation of the parties of the left and their complicity in the rise of neoliberalism. The author also tends to waste lots of pages repeating himself or, worse, telling you what he is going to say next. One would have expected a better standard of editing from Harvard University Press.

Jesper Doepping
5.0 out of 5 stars: A concise definition of neoliberalism and its historical influence, November 14, 2018

Anybody interested in global trade, business, human rights or democracy today should read this book.

The book follows the Austrians from their beginnings in the Habsburg Empire to the start of the rebellion against the WTO. Most importantly, however, it follows the thinking and the thoughts behind the building of a global empire of capitalism with free trade, capital and rights, all the way to the new "human right" to trade. It narrows down what neoliberal thought really consists of, and indirectly draws a distinction from the neoclassical economic tradition.

What I found most interesting is the turn from economics to law, and the conceptual distinctions between genes, tradition and reason, which are translated into a quest for a rational, reason-based protection of dominium (the rule of property) against the overreach of imperium (the rule of states/people). This distinction speaks directly to the issues that the EU is currently facing.

Jackal
3.0 out of 5 stars: A historian with an agenda, October 22, 2018

The author covers Mises, Hayek, and Machlup in Vienna. How to produce order once the Habsburg Empire had been broken up after 1918? They pioneered data gathering about the economy. However, such data came to be used by the left as well, which forced the people mentioned to become intellectual thinkers as opposed to something else(??). I like how the author situates these people in a specific era, but he is reading history backwards. The book moves on, but stays in Central Europe. Ordocapitalism followed after Hitler: a German attempt to have both a strong state and a strong market, which, given Europe's fragmentation, required international treaties. This was seen as a way to avoid another Hitler. Later, international organisations like the IMF and WTO became the new institutions that embedded the global markets. The book ends in the 90s. So, reading history backwards, the author finds quotations from Mises and Hayek that "prove" they were aiming to create intellectual cover for the global financial elite of the 2010s.

Nevertheless, the book is interesting if you like the history of ideas. He frames the questions intelligently in their historical context. However, there is a huge question mark over objectivity. The book is full of lefty dog whistles: the war-making state, regulation of capitalism, reproducing the power of elites, the problem [singular] of capitalism. In a podcast the author states point blank, "I wanted the left to see what the enemy was up to." I find it pathetic that authors are so blatantly partisan. How can we know whether he is objective when he doesn't even try? He dismissively claims that the neoliberal thinkers gave cover to what has become the globalist world order. So why should we not consider the current book as intellectual cover for some "new left" that is about to materialise? Maybe the book is just intellectual cover for the globalist elite being educated in left-wing private colleges.

[Jun 11, 2019] American Exceptionalism and American Innocence A People's History of Fake News_From the Revolutionary War to the War on Terror

Jun 11, 2019 | www.amazon.com

Did the U.S. really "save the world" in World War II? Should black athletes stop protesting and show more gratitude for what America has done for them? Are wars fought to spread freedom and democracy? Or is this all fake news?

American Exceptionalism and American Innocence examines the stories we're told that lead us to think of the U.S. as a force for good in the world, in spite of slavery, the genocide of indigenous peoples, and the more than a century's worth of imperialist war that the U.S. has wrought on the planet.

Sirvent and Haiphong detail just what Captain America's shield tells us about the pretensions of U.S. foreign policy, how Angelina Jolie and Bill Gates engage in humanitarian imperialism, and why the Broadway musical Hamilton is a monument to white supremacy.

====

Like a thunderbolt that penetrates the dark fog of ideological confusion, American Exceptionalism and American Innocence: A People's History of Fake News -- From the Revolutionary War to the War on Terror illuminates the hidden spaces of the official story of the territory that came to be known as the United States of America.

Meticulously researched, American Exceptionalism and American Innocence utilizes a de-colonial lens that debunks the distorted, mythological liberal framework that rationalized the U.S. settler-colonial project. The de-colonized frame allows the authors to critically root their analysis in the psychosocial history, culture, political economy, and evolving institutions of the United States of America without falling prey to the unrecognized and unacknowledged liberalism and national chauvinism that seep through so much of what is advanced as radical analysis today.

That is what makes this work so "exceptional" and so valuable at this moment of institutional and ideological crisis in the U.S. This crisis is indeed more severe and potentially more transformative than at any other moment in this nation's history.

With unflinching clarity, Sirvent and Haiphong go right to the heart of the current social, political, economic, and ideological crisis. They strip away the obscurantist nonsense pushed by liberal and state propagandists that the Trump phenomenon represents a fundamental departure from traditional "American values" by demonstrating that "Trumpism" is no departure at all, but only the unfiltered contemporary and particular expression of the core values that the nation was "founded" on.

What Sirvent and Haiphong expose in their work is that American exceptionalism and its corollary American innocence are the interconnected frames that not only explain why the crude white nationalism of a Donald Trump is consistent with the violence and white supremacy of the American experience, but also why that violence has been largely supported by large sections of the U.S. population repeatedly.

As the exceptional nation, the indispensable nation -- the term President Obama liked to evoke to give humanitarian cover to the multiple interventions, destabilization campaigns, and unilateral global policing operations on behalf of U.S. and international capital -- it is expected and largely accepted by the citizens of the U.S. that their nation-state has a right and, actually, a moral duty to do whatever it deems appropriate to uphold the international order. It can do that because this cause is noble and righteous. Lest we forget the words of Theodore Roosevelt, considered a great architect of American progressivism: "If given the choice between righteousness and peace, I choose righteousness."

In a succinct and penetrating observation, Sirvent and Haiphong point out:

American exceptionalism has always presumed national innocence despite imposing centuries of war and plunder. The American nation-state has been at war for over ninety percent of its existence. These wars have all been justified as necessary ventures meant to defend or expand America's so-called founding values and beliefs. A consequence of centuries of endless war has been the historical tendency of the U.S. to erase from consciousness the realities that surround American domestic and international policy, not to mention the system of imperialism that governs both.

But the acceptance of state violence in the form of economic sanctions and direct and indirect military interventions is not the only consequence of the cultural conditioning process informed by the arrogance of white privilege, white rights, and the protection of white Western civilization. The racist xenophobia, impunity for killer-cops, mass incarceration, ICE raids and checkpoints, left-right ideological convergence to erase "blackness," are all part of the racial management process that still enjoys majoritarian support in the U.S.

American Exceptionalism and American Innocence 's focus on the insidious and corrosive impact of white supremacy throughout the book is a necessary and valuable corrective to the growing tendency toward marginalizing the issue of race, even among left forces under the guise of being opposed to so-called identity politics.

Centering the role of white supremacist ideologies and their connection to American exceptionalism and innocence, Sirvent and Haiphong argue that "communities and activists will be better positioned to dismantle them." American exceptionalism and notions of U.S. innocence not only provide ideological rationalizations for colonialism, capitalism, empire, and white supremacy, but also a normalized theoretical framework for how the world is and should be structured that inevitably makes criminals out of the people opposing U.S. dominance, within the nation and abroad.

Paul Krugman, a leading liberal within the context of the U.S., articulates this normalized framework, which is shared across the ideological spectrum from liberal to conservative and even among some left forces. I have previously referred to this view of the world as representative of the psychopathology of white supremacy:

"We emerged from World War II with a level of both economic and military dominance not seen since the heyday of ancient Rome. But our role in the world was always about more than money and guns. It was also about ideals: America stood for something larger than itself -- for freedom, human rights and the rule of law as universal principles . . . By the end of World War II, we and our British allies had in effect conquered a large part of the world. We could have become permanent occupiers, and/or installed subservient puppet governments, the way the Soviet Union did in Eastern Europe. And yes, we did do that in some developing countries; our history with, say, Iran is not at all pretty. But what we mainly did instead was help defeated enemies get back on their feet, establishing democratic regimes that shared our core values and became allies in protecting those values. The Pax Americana was a sort of empire; certainly America was for a long time very much first among equals. But it was by historical standards a remarkably benign empire, held together by soft power and respect rather than force." 1

American Exceptionalism and American Innocence refutes this pathological view of the U.S. and demonstrates that this view is a luxury that the colonized peoples of the world cannot afford.

The bullet and the bomb -- the American military occupation and the police occupation -- are the bonds that link the condition of Black Americans to oppressed nations around the world. This is the urgency with which the authors approached their task. The physical and ideological war being waged against the victims of the colonial/capitalist white supremacist patriarchy is resulting in real suffering. Authentic solidarity with the oppressed requires a rejection of obfuscation. The state intends to secure itself and the ruling elite by legal or illegal means, by manipulating or completely jettisoning human freedom and democratic rights. Sirvent and Haiphong know that time is running out. They demonstrate the intricate collaboration between the state and the corporate and financial elite to create conditions in which ideological and political opposition would be rendered criminal as the state grapples with the legitimacy crisis it finds itself in. They know that Trump's "make America great again" is the Republican version of Obama's heralding of U.S. exceptionalism, and that both are laying the ideological foundation for a cross-class white neofascist solution to the crisis of neoliberal capitalism.

The U.S. is well on its way toward a new form of totalitarianism, one more widespread than the neofascist rule that was the norm in the Southern states of the U.S. from 1878 to 1965. Chris Hedges refers to it as "corporate totalitarianism." And unlike the sheer social terror experienced by the African American population as a result of the corporatist alignment of the new Democratic Party and national and regional capital in the South, this "new" form of totalitarianism is more benign but perhaps even more insidious, because the control rests on the ability to control thought. And here lies the challenge. The Marxist thinker Fredric Jameson shares a very simple lesson: "The lesson is this, and it is a lesson about system: one cannot change anything without changing everything." This simple theory of system change argues that when you change one part of a system you by necessity must change all parts of the system, because all parts are interrelated.

The failure of the Western left in general and the U.S. left in particular to understand the inextricable, structural connection between empire, colonization, capitalism, and white supremacy -- and that all elements of that oppressive structure must be confronted, dismantled, and defeated -- continues to give lifeblood to a system that is ready to be swept into the dustbin of history. This is why American Exceptionalism and American Innocence is nothing less than an act of subversion. It destabilizes the hegemonic assumptions and imposed conceptual frameworks of bourgeois liberalism and points the reader toward the inevitable conclusion that U.S. society in its present form poses an existential threat to global humanity.

Challenging the reader to rethink the history of the U.S. and to imagine a future, decolonial nation in whatever form it might take, Sirvent and Haiphong include a quote from Indigenous rights supporter Andrea Smith that captures both the subversive and optimistic character of their book. Smith is quoted saying:

Rather than a pursuit of life, liberty, and happiness that depends on the deaths of others . . . we can imagine new forms of governance based on the principles of mutuality, interdependence, and equality. When we do not presume that the United States should or will continue to exist, we can begin to imagine more than a kinder, gentler settler state founded on genocide and slavery.

American Exceptionalism and American Innocence gives us a weapon to reimagine a transformed U.S. nation, but it also surfaces the ideological minefields that we must avoid if we are to realize a new possibility and a new people.

John, May 26, 2019

Great Reading, But Some Omissions

I thought the book was great. However, key events were not discussed. One of the first American expeditionary forces deployed to bless the world established the treaty ports in China. These new American beachheads in the Middle Kingdom came about as a result of Western ambitions to take the country over as new colonial owners, and led to one of the most ruinous periods in world history. Europe and the U.S. saturated the country with opium, leaving many Chinese stoned. This resulted in the destabilization of China, the invasion by the brutal Japanese, and the rise of Mao. The result: millions upon millions of people died because of American exceptionalism. It has taken China the last thirty years to recover from these disasters. Naturally, Trump & Co. are either unaware of this history or unconcerned. However, the Chinese have not forgotten and routinely warn Team Trump that they will not be bullied by foreigners again. Washington elites are ignorant, at the peril of everyone who wants peace. Footnote: American exceptionalists Roosevelt, Kerry, Forbes, etc., got their wealth the old-fashioned way -- by becoming drug kingpins in China.

The other big omission was World War I and especially its aftermath. Lauded by the French for saving European imperialism, returning African-American soldiers found themselves singled out for extra-harsh Jim Crow treatment -- they were too uppity and refused to follow the old social norms. Several Black veterans were tortured and hanged while in uniform because they were bringing back a new message from the European trenches: equal treatment. They were also exemplary in defending other Black citizens from White mob ambushes.

Had the authors covered the WWI aftermath, they would also have had to critique the media in greater detail. They would have had to expose the fact that the media was never a friend to African-Americans, which holds to this day. The media was, and is, consistently aligned with white elite interests. When Blacks rose up against bad treatment, the media always presented the white point of view. In fact, every white institution engaged in this biased practice.

The Espionage Act also put a chill on labor organizing after WWI. Indeed, elites were quick to label any Black unrest seditious, branding leaders such as W.E.B. Du Bois as Bolshevik-inspired and arguing they should be brought up on charges. This was the beginning of the linking of Black activism to the Kremlin, long before McCarthyism, COINTELPRO, and the government's "Black Identity Extremist" label.

[Jun 08, 2019] The Looting Machine Warlords, Oligarchs, Corporations, Smugglers, and the Theft of Africa's Wealth Tom Burgis 9781610397117

Jun 08, 2019 | www.amazon.com

The trade in oil, gas, gems, metals and rare earth minerals wreaks havoc in Africa. During the years when Brazil, India, China and the other "emerging markets" have transformed their economies, Africa's resource states remained tethered to the bottom of the industrial supply chain. While Africa accounts for about 30 per cent of the world's reserves of hydrocarbons and minerals and 14 per cent of the world's population, its share of global manufacturing stood in 2011 exactly where it stood in 2000: at 1 percent.

[Jun 08, 2019] The End of Oil On the Edge of a Perilous New World Paul Roberts 9780618562114 Amazon.com Books

Jun 08, 2019 | www.amazon.com

Christopher R , July 10, 2008

Makes analysis of the contemporary energy order accessible.

When I decided to read this book, I did so with the expectation of learning something only after wading through a great degree of partisan political rhetoric. It did not take me long to realize that Mr. Roberts' book is not what I had expected.

He makes this complex issue accessible to the layman looking to familiarize himself not only with oil, but with the energy economy. Rather than choose a side and engage in partisan sniping, he tells the good, the bad, and the ugly of the policies advocated by every party in the energy debate. Not only does he analyze our present situation, but he also studies several possible ways forward into a new energy economy.

If I were pressed to make a complaint, it would be that I read the original hardcover edition of the book. A lot of the speculation regarding "worst case" scenarios involves $50-a-barrel oil. Now that we are nearly $100 past that worst case, the educated speculation portrayed in the book should be coming to pass in the market. I would like to see either a completely updated 2008 edition or at least one with an updated preface.

John A. Leraas , September 26, 2015
Most informative, well written

A prequel to "The End of Food", this is a most informative book that discusses our dependence on oil: its history, its politics and its economics. After reading it, much is more easily understood, and much of international politics and economics becomes clearer. The slow development of new energy sources, and the dependence of many sectors of the economy on oil, become more transparent.

Roberts' sequel, "The End of Food" is highly recommended after you read this book as the interdependence of these two great industries is amazing.

Larry B. Woodroof , January 19, 2006
An outstanding review of the current situation

Paul Roberts does an excellent job in not only telling about the coming troubles with oil, but doing so with an, at times, humorous style.

He makes no assumptions about the reader's knowledge, and spends the first part of the book explaining how the world got into the mess we are in, by delineating the different energy eras throughout human history.

Common themes arise in each era, and they combine to help the reader gain a perspective on why things are the way they are.

Mr. Roberts did his research well, with an extensive footnote and bibliography section, yet in the course of this research he did more than just peruse reports and other books on the matter. He managed to gain access to the industry leaders, talking with and touring the facilities of the Russians and the Saudis.

If there is any fault, it is that the last chapters of the book, wherein he extrapolates from his knowledge and research what he foresees occurring, seem a little less well developed than the earlier chapters. True, they are based upon fact and not prognostication, but the writing seems at times rushed, and not up to the level of some of the earlier chapters.

Regardless, this is a book that I highly recommend reading, and is one that I have bought extra copies of for insertion into my "lending library" of books I share and recommend to friends.

Dalton C. Rocha, March 26, 2009
Good, but fails about Brazil, biofuels and nuclear power

I read this good book here in Brazil. It has many excellent parts. For example, about Hirohito, on page 39, it tells the truth: Hirohito was Japan's Hitler and ordered the attacks on Pearl Harbor, China and the rest of Asia.
On page 176, the book tells us that more than 90% of new power plants in the USA burn gas. About American culture, page 263 says: "By contrast, although car manufacturers offer more than thirty car models with fuel economy of thirty miles per gallon or better, the ten most fuel-efficient models sold in the United States make up just 2 percent of the sales."
Americans love their SUVs, and combating the petro-dollars that feed the blood of Islamic terrorism has no place in American hearts.
About the corrupt Saudi Arabia, also a supporter of terrorism, this book is correct.

**************************************************************
This book is weak when it forgets Brazil, which is mentioned only once, on page 56, with no detail at all. I don't object to this failure only because I'm a Brazilian, but also because Brazil is among the world's leaders in oil reserves. See, for example, the site [...] to read about this fact.
About nuclear energy, this book is very weak. In Part III, there is talk about replacing coal and gas for electric energy, but nothing about the fact that France, more than 20 years ago, closed all its coal and gas power plants and replaced all of them with nuclear power plants.
About ethanol, there's almost nothing: it is mentioned only on page 340, without any detail. I'm an agronomist and I think that biofuels are the answer to oil, at least in transportation. My family has used ethanol cars for more than 25 years, without any problem.

Roger Brown, September 23, 2007
Fair minded and objective overview of big energy

Very readable. Roberts does an excellent job of presenting opinions fairly and from many pro and contra angles. He has fully immersed himself in his topic, and the book is chock-full of fascinating energy facts.

What to do about our energy future has become as politically polarized as abortion: conservatives favor fossil fuels, and moderate-to-liberal folks want to go renewable.

Roberts is bare-knuckled about what he feels the agendas behind the current debate are, which leads him to a (slightly) reserved pessimism about our chances of making it out of the mess we've made by putting all our energy eggs in one basket. He does not hide his contempt for latter-day politicians who can't see the forest for the trees and won't take action to avert the coming energy drought.

[Jun 08, 2019] Washington s Dark Secret The Real Truth about Terrorism and Islamic Extremism John Maszka

Notable quotes:
"... "A century after World War I, the great war for oil is still raging, with many of the same fronts as before and also a few new ones. Throughout it all -- whether waged by realists, neoliberals, or neocons -- war has been extremely good for business" (225). ..."
Jun 08, 2019 | www.amazon.com

Anna Faktorovich , December 17, 2018

The War for Oil and the New Holocaust

The premise of this book is to say what most of the world's public has probably been thinking since the War on Terror began, or that it is a "war for natural resources -- and that terrorism has little to do with it. Once the military became mechanized, oil quickly became the most sought-after commodity on the planet, and the race for energy was eventually framed as a matter of national security."

John Maszka argues that the "oil conglomerates" are the real "threats to national security." Demonizing "an entire religion" is a repercussion of this policy. My own research in Rebellion as Genre a few years ago also attempted to point out the misuse of the term terrorism in its current application, as a weapon against one's enemies rather than as a reference to a type of attack intended to terrorize. Governments that accuse others of terrorism while legitimizing their own "acts of violence" as "retributive" are clearly breaking human rights agreements and their stated commitments to freedom.

Maszka's perspective is of particular interest because he teaches this subject at the Higher Colleges of Technology in Abu Dhabi, and has published widely his criticisms of the War on Terror, including Terrorism and the Bush Doctrine.

Many of the books I have read on terrorism from American supporters of this pro-War on Terror doctrine are troubling in their references to spreading Christianity and other similarly questionable ideologies, so it is refreshing to hear from somebody with a fresh perspective that is more likely to bring about world peace. The preface acknowledges that this book contrasts with the bulk of other books in this field. It also explains that it focuses primarily on two "Islamic militant organizations -- al-Qaeda and the Islamic State".

He explains that perception has a lot to do with whom a country is willing to commit violence against, giving the example of the Nazis being able to commit violence against Jews in the Holocaust because of this blindness. Thus, violence against Muslims by the West over the past two decades is shown as possibly a new Holocaust, in which the militaries carry out orders because Muslims have been demonized.

Terrorism has historically been the work of a few extremists, with terms like "war" or "revolution" employed to describe large groups of such fighters; so it is strange that the West has entered the War on Terror against entire Muslim-majority countries, killing so many civilians that it is not a stretch to call these campaigns Holocaust-like.

The Islamic State targets Muslims as well, also showing dehumanized traits that are even harder to explain (x-xi). The preface also acknowledges that the author will be using "contractions and anecdotal digressions" as "intentional literary devices," eschewing the standard scholarly style (this is troubling for me personally, as I'm allergic to digressions, but at least he tells readers what to expect).

As promised, Chapter One begins with a poet's story about the Tree of Life, then discusses the Boston Marathon bombings from the perspective of the author as he worked in Kyrgyzstan, and goes off on other tangents before reaching this conclusion -- the marathon's bombers were not terrorists: "They had no political aspirations. They weren't attempting to obtain concessions from the government or provoke a reaction. They simply believed that they were 'wave sheaves' -- first fruits of God -- and that they would be instrumental in ushering in the apocalypse" (5).

This conclusion explains the relationship between all of the digressions across this section, so these digressions were necessary to prove this point, and thus are suitable for a scholarly book. And this is exactly the type of logical reasoning that is missing in most of the oratory on terrorism. The entire book similarly uses specific acts of supposed terrorism to explain what really happened, working to understand the motivations of the actors.

Since the author's digressions into his own life are typically very relevant to the subject, they are definitely helpful: "I was stationed in Riyadh at an American military base that was attacked by an al-Qaeda suicide bomber" (135).

It would actually be unethical if Maszka did not explain that he has been personally affected by al-Qaeda in this context; and since he has seen this War both as a civilian living in the affected countries and as a member of the military that is attacking these "terrorists," his opinions should be trustworthy for both sides. Given how emotional writing this book with detachment and carefully crafted research must have been for somebody who has been bombed, it is only fitting that the final chapter is called "The Definition of Insanity."

And here is the final chapter:

"A century after World War I, the great war for oil is still raging, with many of the same fronts as before and also a few new ones. Throughout it all -- whether waged by realists, neoliberals, or neocons -- war has been extremely good for business" (225).

Very powerful words that are justly supported. I would strongly recommend that everybody in the West's militaries who is responsible for making decisions in the War on Terror read this book before they make their next decision. Who are they shooting at? Why? Who is benefiting? Who is dying? Are they committing war crimes as serious as the Nazis? If there is any chance these allegations are true what kind of a military leader can proceed without understanding the explanations that Maszka offers here? This would probably also work well in an advanced graduate class, despite its digressions, it will probably help students write better dissertations on related topics.

Pennsylvania Literary Journal: Fall 2018

[Jun 08, 2019] The Party's Over Oil, War and the Fate of Industrial Societies by Richard Heinberg

Jun 08, 2019 | www.amazon.com

The world is about to run out of cheap oil and change dramatically. Within the next few years, global production will peak. Thereafter, even if industrial societies begin to switch to alternative energy sources, they will have less net energy each year to do all the work essential to the survival of complex societies. We are entering a new era, as different from the industrial era as the latter was from medieval times.

In The Party's Over , Richard Heinberg places this momentous transition in historical context, showing how industrialism arose from the harnessing of fossil fuels, how competition to control access to oil shaped the geopolitics of the twentieth century and how contention for dwindling energy resources in the twenty-first century will lead to resource wars in the Middle East, Central Asia and South America. He describes the likely impacts of oil depletion and all of the energy alternatives. Predicting chaos unless the United States -- the world's foremost oil consumer -- is willing to join with other countries to implement a global program of resource conservation and sharing, he also recommends a "managed collapse" that might make way for a slower-paced, low-energy, sustainable society in the future.

More readable than other accounts of this issue, with fuller discussion of the context, social implications and recommendations for personal, community, national and global action, Heinberg's updated book is a riveting wake-up call for human-kind as the oil era winds down, and a critical tool for understanding and influencing current US foreign policy.

Richard Heinberg , from Santa Rosa, California, has been writing about energy resources issues and the dynamics of cultural change for many years. A member of the core faculty at New College of California, he is an award-winning author of three previous books. His Museletter was nominated for the Best Alternative Newsletter award by Utne in 1993.


Laura Lea Evans , April 20, 2013

love and hate

Well, how to describe something that is so drastic in predictions as to make one quiver? Heinberg spells out a future for humans that is not very optimistic but sadly, is more accurate than any of us would like. The information and research done by the author is first rate and irrefutable, which is as it should be. The news: dire. This is my first in a series of his work and indeed, it's a love/hate experience since there is a lot of hopelessness in the outcome of our current path. Be that as it may, this is a book to cherish and an author to admire.

Scott Forbes , May 31, 2005
This book will make you think differently about energy

Surprisingly, it's not about the rising cost of the energy that you personally use. It's about the whole economy that has been built on using a non-replenishable energy supply. You know how those economists always count on 3% growth in the GDP. Well, the book argues that this long-term growth is fundamentally driven by our long-term growth in energy usage, which everyone knows will have to turn around at some point.

The other surprising fact is that the turning point comes long before you run out of oil. Heinberg shows data indicating that half of the oil is still left in the ground when the returns start to diminish. And it appears that we are within a few years of reaching that point.

So we've used up about half the "available" (i.e., feasible to extract from an energy perspective) oil. Now oil production starts to decrease. What happens next is anyone's guess, but Heinberg presents some detailed discussions of the possibilities. Don't assume that a coal, nuclear, or "hydrogen" economy is going to be as easy and profitable as the petroleum economy we are leaving behind.

I've read lots of books about energy and the environment, and this is definitely one of the best.

B. King , November 22, 2003
An Industrial Strength Critique of Energy Usage

Part history and part prophesy, this book is an outstanding summary of many major issues facing Western industrial society. Author Richard Heinberg provides a scholarly critique of modern industrialism, focusing on its current use of energy, and a sobering forecast based on predictable trends.

The key point of the book is that the Earth's crust can provide mankind with an essentially finite amount of fossil fuel energy, with primary reference to oil. Drawing on the relatively unknown, and oft-misunderstood, concept of "peak oil," the book addresses the imminent shortfall of petroleum available on world markets. That day of reckoning is far closer than most people think. "Peak oil" is a global application of geologist M. King Hubbert's (1903-1989) studies of oil production in "mature" exploration districts. That is, exploration for oil in sedimentary basins at first yields substantial discoveries, which are then produced. Additional exploration yields less and less "new" oil, and those discoveries come at greater and greater effort. Eventually, absent additional significant discovery, production "peaks" and then commences an irreversible decline.

This has already occurred in the U.S. in the 1970s, and is in the process of occurring in oil-producing nations such as Mexico, Britain, Egypt, Indonesia and Malaysia. Ominously, "peak" production can be forecast in the next few years in such significant producing nations as Saudi Arabia and Iraq (in addition to all of the other problems in those unfortunate nations).

Much of the rise of industrial society was tied to increasing availability of high energy-density fuel, particularly oil. Western society, and its imitators in non-Western lands, is based upon access to large amounts of energy-dense fuel, and that fuel is oil. With respect to the U.S., the domestic decline in oil production has been made up, over the past thirty years, by increasing imports from other locales, with concomitant political risk. When the world production "peaks" in the next few years, the competition for energy sources will become more fierce than it already is. This book addresses issues related to what are commonly thought of as "substitutes" for oil, such as coal, natural gas and natural gas liquids, and shatters many myths. The author also delves deeply into energy sources such as "tar sand," "oil shale," nuclear and renewable sources. And thankfully, the author offers a number of proposals to address the looming problem (although these proposals are probably not what an awful lot of people want to hear.)

A book like this one could easily descend into a tawdry level of "chicken-little" squawks and utter tendentiousness. But thankfully it does not do so. This is a mature, well-reasoned and carefully footnoted effort. I could take issue with some of the author's points about "big business" and how decisions are made at high political levels, but not in this review. Instead I will simply congratulate Mr. Heinberg for writing an important contribution to social discourse. I hope that a lot of people read this book and start to look at and think about the world differently.

This Hippy Gorilla , July 19, 2006
Cogent, timely, largely ignored

Maybe the most important book since Charles Darwin's "The Origin of Species". This volume represents THE wakeup call for a world society quite literally addicted to crude oil for its continuation and, in most cases, its very survival.

Heinberg has done his homework, and this volume should be required reading for anyone in an industrialized nation, or one just getting started down that road. It is a proven scientific fact that within a few years, we will begin to run out of oil, and it will be pretty much gone within 5 or 6 decades. Considering that we have built our entire society around an oil economy, the implications are dire - far, far beyond not being able to drive through the coffee shop with the kids in your SUV on the way home from the mall. Alternative energy sources? Dream on - read on.

The book is thoroughly researched, well-thought and organized and presents the often dissenting views at every side of this hugely important issue. It is also delightfully written and composed, and is fun and quick to read.

I highly recommend this book, and I hope at least one person reads what I'm writing and buys this book. And I hope they tell someone, too.

[Jun 05, 2019] End of Discussion How the Left s Outrage Industry Shuts Down Debate, Manipulates Voters, and Makes America Less Free (and Fun)

Notable quotes:
"... This book covers our current inability to allow all voices to be heard. Key words like "racism " and "?-phobia" (add your preference) can and do end conversations before they begin ..."
"... Hate speech is now any speech about an idea that you disagree with. As we go down the road of drowning out some speech eventually no speech will be allowed. Finger pointers should think about the future, the future when they will be silenced. It's never wrong to listen to different point of view. That's called learning. ..."
"... A very clear and balanced portrait of the current political landscape where a "minority of one" can be supposedly damaged as a result of being exposed to "offensive" ideas. ..."
"... A well documented journey of the transformation from a time when people had vehement arguments into Orwell-Land where the damage one supposedly "suffers" simply from having to "hear" offensive words, allows this shrieking minority to not only silence those voices, but to destroy the lives of the people who have the gall to utter them. ..."
Aug 01, 2017 | www.amazon.com

Q Garcia , August 9, 2017

1984 is Here - Everybody's Brother is Watching

This book covers our current inability to allow all voices to be heard. Key words like "racism " and "?-phobia" (add your preference) can and do end conversations before they begin .

Hate speech is now any speech about an idea that you disagree with. As we go down the road of drowning out some speech eventually no speech will be allowed. Finger pointers should think about the future, the future when they will be silenced. It's never wrong to listen to different point of view. That's called learning.

Brumble Buffin , August 18, 2015
Tolerance gone astray

I became interested in this book after watching Megyn Kelly's interview with Benson (Google it), where he gave his thoughts on the SCOTUS decision to legalize same-sex marriage in all 50 states. He made a heartfelt and reasoned plea for tolerance and grace on BOTH sides. He hit it out of the park with this and set himself apart from some of his gay peers who are determined that tolerance is NOT a two-way street.

We are seeing a vindictive campaign of lawsuits and intimidation against Christian business people who choose not to provide flowers and cakes for same-sex weddings. The First Amendment says that Congress shall make no law prohibiting the free exercise of religion. Thumbing your nose at this core American freedom should alarm us all. Personally, I'm for traditional marriage and I think the better solution would be to give civil unions the same legal rights and obligations as marriage, but that's another discussion.

So what about the book? It exceeded my expectations. Ham and Benson are smart and articulate. Their ideas are clearly presented, supported by hard evidence and they are fair and balanced. The book is a pleasure to read - - unless you are a die-hard Lefty. In that case, it may anger you, but anger can be the first step to enlightenment.

Steve Bicker , August 1, 2015
A Well Documented Death of Debate

A very clear and balanced portrait of the current political landscape where a "minority of one" can be supposedly damaged as a result of being exposed to "offensive" ideas.

A well documented journey of the transformation from a time when people had vehement arguments into Orwell-Land where the damage one supposedly "suffers" simply from having to "hear" offensive words, allows this shrieking minority to not only silence those voices, but to destroy the lives of the people who have the gall to utter them.

The Left lays claim to being the "party of tolerance", unless you happen to "think outside THEIR box", which, to the Left is INtolerable and must not only be silenced, but exterminated... A great book!

[May 22, 2019] XINYUNG Fitness Tracker Smart Watch, Activity Tracker with Heart Rate Monitor, Waterproof Pedometer Watch with Slee

May 22, 2019 | www.amazon.com

Features:

Heart Rate Monitor

Sleep Monitor

IP67 Life Waterproof

Smart Notifications

Connected GPS

7 Sport Modes

3 Alarm clock

Sedentary reminder

Remote Camera Control

Custom Dial

Music Player Control

6 Brightness Level Adjustment

[May 21, 2019] Updated 2019 Version Fitness Tracker HR, Activity Trackers Health Exercise Watch with Heart Rate

May 21, 2019 | www.amazon.com

Comfortable band with a fairly robust app, April 20, 2019

For a fitness tracker, this is fairly cheap and robust for the size and price. The tracker does basic readings of steps, pulse, blood pressure, sleep patterns, and physical activity. As far as I can tell, the tracker is fairly accurate in all of the above; however, given the limited routines for physical activity available, it might not be as accurate unless you stick to the regimens it readily provides. It's comfortable, with a long battery life (7-8 days with light activity, 5-6 days if you're a particularly active person), and is sufficiently water-resistant to be worn in the shower.

The app itself (GloryFit) is a fairly robust app and helps portray meaningful metrics around activity and sleep patterns. Pairing and using the device is pretty easy - just be within range and select the device.

Much of the package leaves a bit more to be desired. Both the instructions provided and app are riddled with grammatical and spelling errors that might turn most people off, but if you can look past these, it's still a fairly good set. The setup with the main touchscreen button is a bit weird, as I'm used to more touchscreen interfaces instead of a singular button. In this case, the tracker uses the button to either cycle on short presses, or select on long presses. The last minor gripe is taking off the band to expose the charging port for the tracker, but the instructions have a fairly nice picture of how to do that.

Pros: comfortable; easy to use; low profile

Cons: directions/app provided are hard to read/interpret; not intuitive to use

Recommended Use: for those with a fairly sedentary lifestyle; monitoring of basic vitals

[May 13, 2019] Big Israel How Israel's Lobby Moves America by Grant F. Smith

The Jewish lobby does not represent the views of the US Jewish population. It represents a small number of rich donors (the concentration is just staggering) and, of course, the Israeli government. Those organizations are non-representative authoritarian bodies, with many functionaries serving for life or for extremely long tenures.
Notable quotes:
"... One stunning example of this influence occurred recently. At one time during the nominating process for the Republican candidate for President in the current election, every single aspirant to the nomination made a pilgrimage to Las Vegas to kiss the money ring of Sheldon Adelson, whose only declared interests are Israel and online gambling. This is the same super-patriot Sheldon Adelson who wanted Mitt Romney to pardon Jonathan Pollard, should Romney become President with Adelson's financial backing. ..."
Feb 05, 2016 | www.amazon.com

The latest in the powerful series of titles written by Grant Smith. Highly recommended factual, documented and accessible data that should be required reading for high school students as well as their parents!

James Robinson , July 26, 2016

Would have been a tedious read for someone well acquainted with Israeli machinations

Superb compilation of organizations that receive tax-exempt status in the US while working exclusively on behalf of a foreign nation, Israel, often to the pronounced detriment of US interests and policies. It would have been a tedious read for someone well acquainted with Israeli machinations, but for someone new to the subject, the anger that the revelations produce makes the reading of this book a disquieting breeze.

Ronald Johnson , April 11, 2016
non-systematic conjecture about Zionism's amazing insider access to

Book Review of Big Israel, by Grant F. Smith

This is an important book, the latest from Grant F. Smith in the line of his previous investigations into what was referred to as, the "Zionist Occupied Government", an earlier, intuitive, non-systematic conjecture about Zionism's amazing insider access to, and influence of, U.S. foreign policy. It is interesting that Wikipedia describes the "ZOG" exclusively as an anti-semitic conspiracy theory attributed to a list of unsavory persons and racist organizations.

On the one hand, the American Israel Public Affairs Committee puts on a very public celebration every spring, the "policy conference", that is a pep rally of mandatory attendance by national Administration and Congressional people to celebrate Zionism. That event is public. But on the other hand, as Grant Smith analyzes, the "Israel Affinity Organizations" of the United States are a different archipelago.

As to what extent these organizations are legitimate lobbies, versus being mis-identified agents of a foreign power, I won't attempt to summarize, or, "give away" the content of the book; it is for people to read for themselves, to be informed, and to think for themselves.

Grant Smith presents numbers, names, and dates, to be reviewed and challenged by anyone who wants to. There is precedent for that. The USS Liberty attack by Israel was defended as a tragic mistake by author A. Jay Cristol, in his book, "The Liberty Incident". The Wiesenthal Center commissioned the author, Harold Brackman, to write, "Ministry of Lies, the Truth Behind the 'Secret Relationship Between Blacks and Jews' ". That referenced book was by the Nation of Islam. With facts in hand, the Electorate is empowered to make informed decisions about the US national interest, relative to Zionism.

Another good book, by Alison Weir, covers essentially the same subject: "Against Our Better Judgement: The Hidden History of How the U.S. Was Used to Create Israel". The Amazon pages for that book are loaded with discussions, which can be seen under that title. The Amazon book reviews are a valuable national resource that can serve as a place to survey public opinion, even allowing for the fact that positive impressions provide less motivation than negative ones to inspire writing an essay.

D L Neal , May 28, 2018
at least at this time- Wonderful, informative and important book

It is obvious why there is no middle ground in the reviews here, at least at this time: a wonderful, informative and important book.

Luther , May 15, 2016
"America. . . you can move very easily. . .." Netanyahu

No matter what your values -- Christian, Enlightenment, social justice, international law, natural law, the Kantian imperative, crimes against humanity, Judaism's own values (Israel "a light unto the nations" Isaiah 49:6) -- what Israel has done and is doing to the Palestinians is morally wrong.
Sure. People have done bad things to other people forever, but this evil is orchestrated by a large Zionist organization from all over the world. And the US is being made complicit in this immoral undertaking in the numerous ways Grant Smith explores in his book.

Exposing America's unfortunate entanglement is why he wrote this excellent book: 300 pages and 483 footnotes of support for the claims he makes.
The American democratic process is being corrupted at every level in the interests of Israel, and Smith gives chapter and verse on how this is being done.

One stunning example of this influence occurred recently. At one time during the nominating process for the Republican candidate for President in the current election, every single aspirant to the nomination made a pilgrimage to Las Vegas to kiss the money ring of Sheldon Adelson, whose only declared interests are Israel and online gambling. This is the same super-patriot Sheldon Adelson who wanted Mitt Romney to pardon Jonathan Pollard, should Romney become President with Adelson's financial backing.

In addition, Haim Saban of the Brookings Institution plays a similar role in the Democratic party. He has said: "I'm a one-issue guy, and my issue is Israel." He has promised to contribute as much money as needed to elect Hillary Clinton, someone who believes that Israel has a right to exist as a "Jewish state," with Jerusalem (an international city for millennia) as its capital (something no country in the world approves of, not even the USA).

  1. Is this the American democratic process in action?
  2. Is this what the Constitution intends?
  3. Is this our America?

Grant discusses in supported detail the areas of dual citizenship and dual loyalties (American citizens in the Israeli Defense Force); espionage (industrial and military); yearly billions to Israel with no benefit to the US; media control (no debating the facts of history; no Palestinians allowed to articulate and disseminate their narrative); tax exemption for money which goes to Jewish interests as well as the illegal settlements in Israel; perversion of education (forced Holocaust information but no discussion; anti-assimilation); foreign policy (the war with Iraq for the benefit of Israel; the demonization of Iran; no condemnation of Israel's nuclear capability in spite of the Non-Proliferation Treaty; use of the veto in the UN in Israel's interests; Middle East "regime change" wars); Israeli and Jewish influence in Congress (money, intense lobbying by AIPAC and free trips to Israel), and financial contributions only to candidates who are unequivocally pro-Israel, in some cases very large sums of money.

The point is that all of this is being done in spite of the wishes and best interests of the American people and even of Israel. It's not as though the American people voted to do bad things to the Palestinians: kill them, starve them, imprison them, steal from them, and control them. Quite the opposite: as Grant Smith explains, unbiased polls indicate that most Americans show no such support for Israel's mistreatment of the Palestinians and believe that if both sides would abide by international law, the Geneva Conventions, and the UN resolutions relating to Palestine, peace could be achieved between Jews and Arabs in Palestine.

But Zionism has a different agenda, an agenda that will use any means legal and illegal to promote its interests by getting the United States to back it up.
And that agenda is the problem because it is built on non-negotiable beliefs.

What can you say to someone who believes that the Bible mandates the establishment of a Jewish homeland in Palestine to the exclusion of the indigenous inhabitants?

Or, as Rabbi Ovadia Yosef said in 2010: "The Goyim [non-Jews] are born only in order to serve us. Besides this, they have no place on earth -- only to serve the people Israel."

Not surprisingly, the never-ending "peace process" goes on and on, with no peace in sight.

The US, in spite of itself, continues to support this cruel charade against its own interests and at the expense of neighbors, friends, allies and innocent parties in Palestine and elsewhere in the world.

Grant Smith's excellent book is an attempt to raise America's awareness to the point that something might be done.

[May 13, 2019] America The Farewell Tour by Chris Hedges

Sep 05, 2018 | www.amazon.com
Chapter 1 - DECAY
Chapter 2 - HEROIN
Chapter 3 - WORK
Chapter 4 - SADISM
Chapter 5 - HATE
Chapter 6 - GAMBLING
Chapter 7 - FREEDOM
Acknowledgments
Notes
Bibliography
Index

I walked down a long service road into the remains of an abandoned lace factory. The road was pocked with holes filled with fetid water. There were saplings and weeds poking up from the cracks in the asphalt. Wooden crates, rusty machinery, broken glass, hulks of old filing cabinets, and trash covered the grounds. The derelict complex, 288,000 square feet, consisted of two huge brick buildings connected by overhead, enclosed walkways.

The towering walls of the two buildings, with the service road running between them, were covered with ivy. The window panes were empty or had frames jagged with shards of glass. The thick wooden doors to the old loading docks stood agape. I entered the crumbling complex through a set of double wooden doors into a cavernous hall.

The wreckage of industrial America lay before me, home to flocks of pigeons that, startled by my footsteps over the pieces of glass and rotting floorboards, swiftly left their perches in the rafters and air ducts high above my head. They swooped, bleating and clucking, over the abandoned looms.

The Scranton Lace Company was America. It employed more than 1,200 workers on its imported looms, some of the largest ever built.

Gary Moreau, Author TOP 500 REVIEWER, September 5, 2018

Washington is fiddling but it is the capitalist collective that is setting the fires

Throughout history, all great civilizations have ultimately decayed. And America will not be an exception, according to former journalist and war correspondent, Chris Hedges. And while Hedges doesn't offer a date, he maintains we are in the final throes of implosion -- and it won't be pretty.

The book is thoroughly researched and the author knows his history. And despite some of the reviews it is not so much a political treatise as it is an exploration of the American underbelly -- drugs, suicide, sadism, hate, gambling, etc. And it's pretty dark; although he supports the picture he paints with ample statistics and first person accounts.

There is politics, but the politics provides the context for the decay. And it's not as one-dimensional as other reviewers seemed to perceive. Yes, he is no fan of Trump or the Republican leadership. But he is no fan of the Democratic shift to identity politics, or antifa, either.

One reviewer thought he was undermining Christianity but I didn't get that. He does not support "prosperity gospel" theology, but I didn't see any attempt to undermine fundamental religious doctrine. He is, after all, a graduate of Harvard Divinity School and an ordained Presbyterian minister.

He puts the bulk of the blame for the current state of decay, in fact, where few other writers do -- squarely on the back of capitalist America and the super-companies who now dominate nearly every industry. The social and political division we are now witnessing, in other words, has been orchestrated by the capital class; the class of investors, banks, and hedge fund managers who don't create value so much as they transfer it to themselves from others with less power. And I think he's spot on right.

We have seen a complete merger of corporate and political America. Politicians on both sides of the aisle serve at the pleasure of the capitalist elite because they need their money to stay in power. Corporations enjoy all the rights of citizenship save voting, but who needs to actually cast a ballot when you can buy the election.

And what the corpocracy, as I call it, is doing with all that power is continuing to reshuffle the deck of economic opportunity to ensure that wealth and income continue to polarize. It's a process they undertake in the name of tax cuts for the middle class (which aren't), deregulation (which hurts society as a whole), and the outright transfer of wealth and property (including millions of acres of taxpayer-owned land) from taxpayers to shareholders (the 1%).

I know because I was part of it. As a former CEO and member of four corporate boards I had a front row seat from the 1970s on. The simplest analogy is that the gamblers rose up and took control of the casinos and the government had their backs in a kind of quid pro quo, all having to do with money.

They made it stick because they turned corporate management into the ultimate capitalists. The people who used to manage companies and employees are now laser focused on managing the companies' stock price and enhancing their own wealth. Corporate executives, in a word, became capitalists, not businessmen and women, giving the foxes unfettered control of the hen house.

They got to that position through a combination of greed -- both corporate management's and that of shareholder activists -- but were enabled and empowered by Washington. Beginning in the 1970s the Justice Department antitrust division, the Labor Department, the EPA, and other institutions assigned the responsibility to avoid the concentration of power that Adam Smith warned us about, and to protect labor and the environment, were all gutted and stripped of power.

They blamed it on globalism, but that was the result, not the cause. Gone are the days of any corporate sense of responsibility to the employees, the collective good, or the communities in which they operate and whose many services they enjoy. It is the corporate and financial elite, and they are now one and the same, who have defined the "me" world in which we now live.

And the process continues: "The ruling corporate kleptocrats are political arsonists. They are carting cans of gasoline into government agencies, the courts, the White House, and Congress to burn down any structure or program that promotes the common good." And he's right. And Trump is carrying those cans.

Ironically, Trump's base, who have been most marginalized by the corpocracy, are the ones who put him there to continue the gutting. But Hedges has an explanation for that. "In short, when you are marginalized and rejected by society, life often has little meaning. There arises a yearning among the disempowered to become as omnipotent as the gods. The impossibility of omnipotence leads to its dark alternative -- destroying like the gods." (Reference to Ernest Becker's The Denial of Death.)

The economic history and understanding of economic theory here is rich and detailed. Capitalism, as Marx and others pointed out, creates great wealth in the beginning but is doomed to failure due to its inability to continue to find sources of growth and to manage inequities in wealth creation. And you don't have to be a socialist to see that this is true. Capitalism must be managed. And our government is currently making no attempt to do so. It is, in fact, dynamiting the institutions responsible for doing so.

All told, this is a very good book. If you don't like reading about underbellies (I found the chapter devoted to sadism personally unsettling, being the father of two daughters), you will find some of it pretty dark. Having said that, however, the writing is very good and Hedges never wallows in the darkness. He's clearly not selling the underbelly; he's trying to give it definition.

I did think that some of the chapters might have been broken down into different sub-chapters and there is a lack of continuity in some places. All told, however, I do recommend the book. There is no denying the fundamental thesis.

The problem is, however, we're all blaming it on the proverbial 'other guy.' Perhaps this book will help us to understand the real culprit -- the capitalist collective. "The merging of the self with the capitalist collective has robbed us of our agency, creativity, capacity for self-reflection, and moral autonomy." True, indeed.


S. Ferguson , September 1, 2018

"Justice is a manifestation of Love..."

The inimitable Hedges is not only a saint with a penetrating intelligence, but also a man of superior eloquence with the power to pull you into his descriptions of the collapse of western civilization. Hedges says that the new American Capitalism no longer produces products -- rather America produces escapist fantasies. I found this paragraph [page 233] particularly relevant. The act of being dedicated to the 'greater good' has in itself become dangerous.

Chris Hedges: "We do not become autonomous and free human beings by building pathetic, tiny monuments to ourselves. It is through self-sacrifice and humility that we affirm the sanctity of others and the sanctity of ourselves. Those who fight against cultural malice have discovered that life is measured by infinitesimal and often unacknowledged acts of solidarity and kindness. These acts of kindness spin outward to connect our atomized and alienated souls to others. The good draws to it the good. This belief -- held although we may never see empirical proof -- is profoundly transformative. But know this: when these acts are carried out on behalf of the oppressed and the demonized, when compassion defines the core of our lives, when we understand that justice is a manifestation of love, we are marginalized and condemned by our sociopathic elites."

Amazon Customer , September 7, 2018
Great (Recycled) Hedges Rants

If you've never read Hedges - get it now. If you've read him before - there's nothing new here.

Chris Hedges is a writer who has a knack for seeing the big picture and connecting the dots. A chronic pessimist in the best sense, a bitter prophet warning us of the last days of the decaying empire, his page-turning prose carves through the morass of today's mania and derangement. For that, he's in the company somewhere between Cornel West and Morris Berman (the latter, whose book Why America Failed, is better than this; if you're familiar with Hedges but not Morris Berman, go find Berman instead).

I give this three stars only because there isn't much new here if you're familiar with his material. I felt this book to be an update of Empire of Illusion, punched up by old articles from his weekly column at Truthdig. Aside from the introductory chapter, he revisits themes of sadism, the decline of literacy, of labor, of democratic institutions, and so on, which are too familiar. I felt the pages and pages detailing the BDSM craze were excessive in their prurient voyeurism, a trap journalistic approaches can fall into. Not saying he's wrong at all, but this tone could put off some readers, erring on excessive preacherly seminarian virtue signaling as he points out the sins of the world and shouts - "Look! Look at what we've done!"

swisher , August 21, 2018
I'd give a million stars if possible

Heartbreaking to read but so true. In our "truth is not truth" era Mr. Hedges once again writes the sad and shocking obituary for American Democracy and sounds the prophetic alarm to those revelers while Rome burns. All empires come and go, but I never thought I'd be a witness to one. Something sick and traitorous has infected the soul of America, and I fear it's going to be some demented combination of the worst elements in 1984 and Brave New World. The most important work currently published, but will anyone listen? Will anything change?

ChrisD , September 5, 2018
Well worth reading - an important perspective

The author is honest and intelligent. When you take a detailed look at reality it can seem harsh.

Don't shoot the messenger who has brought bad news. We need to know the truth. Read, listen, learn. Engage in positive actions to improve the situation.
Chris has given us a wake-up call.

[May 11, 2019] A Texan Looks At Lyndon: A Study In Illegitimate Power

May 31, 2003 | www.amazon.com

Kurt Harding

A Devastating Diatribe, May 31, 2003

It would be an understatement to say that author Haley does not like Lyndon Baines Johnson. And despite the fact that his book is an unrelenting tirade against all things Lyndon, it provides a useful service in reminding the reader of how Johnson trampled and double-crossed friend and foe alike in his single-minded lust for power.

I am fairly conservative politically, but I am open-minded enough to recognize and oppose corruption whether practiced by liberals or conservatives. In my lifetime, Johnson, Nixon, and Clinton have been shining examples of the worst impulses in American presidential politics in which greed and lust for either power or money ended up overshadowing any of their real achievements.

Haley shows that Johnson was a man of few real principles, neither liberal nor conservative, but rather a man who almost always wanted to know which way the wind was blowing before taking a stand on any important issue. Johnson was a man who used all his powers of persuasion and veiled threats to get what he wanted, and woe unto anyone who stood in his way.

He was a man who knew and used the old adage "It's not what you know, but who you know" to Machiavellian extremes.

But he was also a man of sometimes great political courage who would rarely give an inch once he took a stand. He hated those who opposed him, nursed resentments, and wreaked revenge on those who crossed him in the least, as most of his enemies and many of his friends learned to their sorrow. From the earliest days, he was involved with corrupt Texas politicians from the local to the state level and swam in the seas of corporate corruption with the likes of the infamous swindler Billie Sol Estes and others of his stripe.

Admittedly, the conservatism of the author is the conservatism of a bygone age and the reader will recognize that the book is meant to be a partisan attack on Johnson. Some of the attacks on Johnson are made solely for political reasons as Johnson was clever enough to outmaneuver Haley's ideological brothers and sisters. But Johnson surrounded himself with enough scummy characters and got involved in so many underhanded political AND business deals that he deserves the rough treatment given him in Haley's devastating diatribe.

No matter your political leanings, your eyes will be opened when you read A Texan Looks At Lyndon. The book is well-written and often riveting in its allegations and revelations, but it loses one star for occasional hysteria. If US or Texas politics interests you, then I highly recommend this.

Randall Ivey

You have been warned, July 31, 2000

Haley wrote this book (and published it himself) in 1964 basically as a campaign tract for Barry Goldwater. In the intervening years it has become a classic of its kind, a philippic, to use M.E. Bradford's term, tracing the illegitimate rise to power of Lyndon Baines Johnson.

If you're politically naive, this book will grow hair on your chest. It's an unblinking, fearless portrait of Johnson's wheeling and dealing and underhanded methods to achieve the power, prestige, and money he craved all his life.

Haley names all the names and lays out facts and figures for the reader to make up his mind. And the reader winds up shaking his head in utter astonishment. The best part of the book is that detailing Johnson's eventual election to the U.S. Senate in a contest with former Gov. Coke Stevenson.

The election was clearly Stevenson's, but through the machinations of George Parr, the notorious Duke of Duval County, the results were turned around in LBJ's favor. Investigators later found that among those voting in the primary were people who didn't live in the county anymore and people who weren't alive at all. But the results stood.

(An interesting and amusing aside: when Haley ran for Texas governor in 1956, he approached Parr and said, "I'm Evetts Haley. I'm running for governor, and if I win, it will be my privilege to put you in jail."

Parr's reply: "I believe you will." Parr, the Artful Dodger of Texas politics for years, eventually killed himself.)

At times the book grows tiresome, especially in the Bobby Baker and Billie Sol Estes scandals, where Haley turns such a torrent of names and numbers on the reader as to be sometimes confusing.

[Apr 27, 2019] The War on Normal People The Truth About America's Disappearing Jobs and Why Universal Basic Income Is Our Future Andrew Yang

Looks like this guy somewhat understands the problems with neoliberalism, but is still captured by neoliberal ideology.
Apr 27, 2019 | www.amazon.com

The logic of the meritocracy is leading us to ruin, because we are collectively primed to ignore the voices of the millions getting pushed into economic distress by the grinding wheels of automation and innovation. We figure they're complaining or suffering because they're losers.

We need to break free of this logic of the marketplace before it's too late.

[Neoliberalism] had decimated the economies and cultures of these regions and were set to do the same to many others.

In response, American lives and families are falling apart. Rampant financial stress is the new normal. We are in the third or fourth inning of the greatest economic shift in the history of mankind, and no one seems to be talking about it or doing anything in response.

The Great Displacement didn't arrive overnight. It has been building for decades as the economy and labor market changed in response to improving technology, financialization, changing corporate norms, and globalization. In the 1970s, when my parents worked at GE and Blue Cross Blue Shield in upstate New York, their companies provided generous pensions and expected them to stay for decades. Community banks were boring businesses that lent money to local companies for a modest return. Over 20 percent of workers were unionized. Some economic problems existed -- growth was uneven and inflation periodically high. But income inequality was low, jobs provided benefits, and Main Street businesses were the drivers of the economy. There were only three television networks, and in my house we watched them on a TV with an antenna that we fiddled with to make the picture clearer.

That all seems awfully quaint today. Pensions disappeared for private-sector employees years ago. Most community banks were gobbled up by one of the mega-banks in the 1990s -- today five banks control 50 percent of the commercial banking industry, which itself mushroomed to the point where finance enjoys about 25 percent of all corporate profits. Union membership fell by 50 percent.

Ninety-four percent of the jobs created between 2005 and 2015 were temp or contractor jobs without benefits; people working multiple gigs to make ends meet is increasingly the norm. Real wages have been flat or even declining. The chances that an American born in 1990 will earn more than their parents are down to 50 percent; for Americans born in 1940 the same figure was 92 percent.

Thanks to Milton Friedman, Jack Welch, and other corporate titans, the goals of large companies began to change in the 1970s and early 1980s. The notion they espoused -- that a company exists only to maximize its share price -- became gospel in business schools and boardrooms around the country. Companies were pushed to adopt shareholder value as their sole measuring stick.

Hostile takeovers, shareholder lawsuits, and later activist hedge funds served as prompts to ensure that managers were committed to profitability at all costs. On the flip side, CEOs were granted stock options for the first time that wedded their individual gain to the company's share price. The ratio of CEO to worker pay rose from 20 to 1 in 1965 to 271 to 1 in 2016. Benefits were streamlined and reduced and the relationship between company and employee weakened to become more transactional.

Simultaneously, the major banks grew and evolved as Depression-era regulations separating consumer lending and investment banking were abolished. Financial deregulation started under Ronald Reagan in 1980 and culminated in the Financial Services Modernization Act of 1999 under Bill Clinton that really set the banks loose. The securities industry grew 500 percent as a share of GDP between 1980 and the 2000s while ordinary bank deposits shrank from 70 percent to 50 percent. Financial products multiplied as even Main Street companies were driven to pursue financial engineering to manage their affairs. GE, my dad's old company and once a beacon of manufacturing, became the fifth biggest financial institution in the country by 2007.

Nolia Nessa , April 5, 2018

profound and urgent work of social criticism

It's hard to be in the year 2018 and not hear about the endless studies alarming the general public about coming labor automation. But Yang provides two key things in this book: first, that automation has already been ravaging the country, which has led to the great political polarization of today; and second, an actual vision of what happens when people lose jobs. It definitely is a lightning strike of "oh crap."

I found this book relatively impressive and frightening. Yang, a former lawyer, entrepreneur, and non-profit leader, shows with inarguable data that when companies automate work and use new software, communities die, drug use increases, suicide increases, and crime skyrockets. The new jobs created go to big cities, the surviving talent leaves, and the remaining people lose hope and descend into madness. (As a student of psychology, this is not surprising.)

He starts by painting the picture of the average American and how fragile they are economically. He deconstructs the labor predictions and how technology is going to ravage it. He discusses the future of work. He explains what has happened in technology and why it's suddenly a huge threat. He shows what this means: economic inequality rises, the people have less power, the voice of democracy is diminished, no one owns stocks, people get poorer etc. He shows that talent is leaving small towns, money is concentrating to big cities faster. He shows what happens when those other cities die (bad things), and then how the people react when they have no income (really bad things). He shows how retraining doesn't work and college is failing us. We don't invest in vocational skills, and our youth is underemployed pushed into freelance work making minimal pay. He shows how no one trusts the institutions anymore.

Then he discusses solutions with a focus on Universal Basic Income. I was a skeptic of the idea until I read this book. You literally walk away with this burning desire to prevent a Mad Max-esque civil war, and it's hard to argue with him. We don't have much time, and our bloated micromanaged welfare programs cannot sustain us.

[Apr 23, 2019] The Secret Team The CIA and Its Allies in Control of the United States and the World by L. Fletcher Prouty

Notable quotes:
"... The CIA is the center of a vast mechanism that specializes in Covert Operations ... or as Allen Dulles used to call it, "Peacetime Operations." ..."
"... the CIA is the willing tool of a higher level Secret Team, or High Cabal, that usually includes representatives of the CIA and other instrumentalities of the government, certain cells of the business and professional world and, almost always, foreign participation. It is this Secret Team, its allies, and its method of operation that are the principal subjects of this book. ..."
"... vast intergovernmental undercover infrastructure and its direct relationship with great private industries, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The Secret Team has very close affiliations with elements of power in more than three-score foreign countries and is able when it chooses to topple governments, to create governments, and to influence governments almost anywhere in the world. ..."
"... the power of the Team is enhanced by the "cult of the gun" and by its sometimes brutal and always arbitrary anti-Communist flag waving, even when real Communism had nothing to do with the matter at hand. ..."
"... To be a member, you don't question, you don't ask; it's "Get on the Team" or else. One of its most powerful weapons in the most political and powerful capitals of the world is that of exclusion. To be denied the "need to know" status, like being a member of the Team, even though one may have all the necessary clearances, is to be totally blackballed and eliminated from further participation. Politically, if you are cut from the Team and from its insider's knowledge, you are dead. In many ways and by many criteria the Secret Team is the inner sanctum of a new religious order. ..."
"... At the heart of the Team, of course, are a handful of top executives of the CIA and of the National Security Council (NSC), most notably the chief White House adviser to the President on foreign policy affairs. ..."
"... It is often quite difficult to tell exactly who many of these men really are, because some may wear a uniform and the rank of general and really be with the CIA and others may be as inconspicuous as the executive assistant to some Cabinet officer's chief deputy. ..."
"... even more damaging to the coherent conduct of foreign and military affairs, it is a bewildering collection of semi-permanent or temporarily assembled action committees and networks that respond pretty much ad hoc to specific troubles and to flash-intelligence data inputs from various parts of the world, sometimes in ways that duplicate the activities of regular American missions, sometimes in ways that undermine those activities, and very often in ways that interfere with and muddle them. ..."
"... This report is a prime example of how the Secret Team, which has gained so much control over the vital foreign and political activities of this government, functions. ..."
"... Although even in his time he had seen the beginning of the move of the CIA into covert activities, there can be little doubt that the "diversion" to which he made reference was not one that he would have attributed to himself or to any other President. Rather, the fact that the CIA had gone into clandestine operations and had been "injected into peacetime cloak-and-dagger operations," and "has been so much removed from its intended role" was more properly attributable to the growing and secret pressures of some other power source. As he said, the CIA had become "a symbol of sinister and mysterious foreign intrigue." ..."
Apr 23, 2019 | www.amazon.com

I was the first author to point out that the CIA's most important "Cover Story" is that of an "Intelligence" agency. Of course the CIA does make use of "intelligence" and "intelligence gathering," but that is largely a front for its primary interest, "Fun and Games." The CIA is the center of a vast mechanism that specializes in Covert Operations ... or as Allen Dulles used to call it, "Peacetime Operations."

In this sense, the CIA is the willing tool of a higher level Secret Team, or High Cabal, that usually includes representatives of the CIA and other instrumentalities of the government, certain cells of the business and professional world and, almost always, foreign participation. It is this Secret Team, its allies, and its method of operation that are the principal subjects of this book.

It must be made clear that at the heart of Covert Operations is the denial by the "operator," i.e. the U.S. Government, of the existence of national sovereignty. The Covert operator can, and does, make the world his playground ... including the U.S.A. Today, early 1990, the most important events of this century are taking place with the ending of the "Cold War" era, and the beginning of the new age of "One World" under the control of businessmen and their lawyers, rather than the threat of military power. This scenario for change has been brought about by a series of Secret Team operations skillfully orchestrated while the contrived hostilities of the Cold War were at their zenith.

... ... ...

We may wish to note that in the book "Gentleman Spy: The Life of Allen Dulles," the author, Peter Grose, cites Allen Dulles's response to an invitation to the luncheon table from Hoover's Secretary of State, Henry L. Stimson. Allen Dulles assured his partners in the Sullivan & Cromwell law firm, "Let it be known quietly that I am a lawyer and not a diplomat." He could not have made a more characteristic and truthful statement about himself. He always made it clear that he did not "plan" his work; he was always the "lawyer" who carried out the orders of his client, whether the President of the United States or the president of the local bank.

The Secret Team (ST) being described herein consists of security-cleared individuals in and out of government who receive secret intelligence data gathered by the CIA and the National Security Agency (NSA) and who react to those data, when it seems appropriate to them, with paramilitary plans and activities, e.g. training and "advising" -- a not exactly impenetrable euphemism for such things as leading into battle and actual combat -- Laotian tribal troops, Tibetan rebel horsemen, or Jordanian elite Palace Guards.

Membership on the Team, granted on a "need-to-know" basis, varies with the nature and location of the problems that come to its attention, and its origins derive from that sometimes elite band of men who served with the World War II Office of Strategic Services (OSS) under the father of them all, General "Wild Bill" William J. Donovan, and in the old CIA.

The power of the team derives from its vast intergovernmental undercover infrastructure and its direct relationship with great private industries, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The Secret Team has very close affiliations with elements of power in more than three-score foreign countries and is able when it chooses to topple governments, to create governments, and to influence governments almost anywhere in the world.

Whether or not the Secret Team had anything whatsoever to do with the deaths of Rafael Trujillo, Ngo Dinh Diem, Ngo Dinh Nhu, Dag Hammarskjold, John F. Kennedy, Robert F. Kennedy, Martin Luther King, and others may never be revealed, but what is known is that the power of the Team is enhanced by the "cult of the gun" and by its sometimes brutal and always arbitrary anti-Communist flag waving, even when real Communism had nothing to do with the matter at hand.

The Secret Team does not like criticism, investigation, or history and is always prone to see the world as divided into but two camps -- "Them" and "Us." Sometimes the distinction may be as little as one dot, as in "So. Viets" and "Soviets," the So. Viets being our friends in Indochina, and the Soviets being the enemy of that period. To be a member, you don't question, you don't ask; it's "Get on the Team" or else. One of its most powerful weapons in the most political and powerful capitals of the world is that of exclusion. To be denied the "need to know" status, like being a member of the Team, even though one may have all the necessary clearances, is to be totally blackballed and eliminated from further participation. Politically, if you are cut from the Team and from its insider's knowledge, you are dead. In many ways and by many criteria the Secret Team is the inner sanctum of a new religious order.

At the heart of the Team, of course, are a handful of top executives of the CIA and of the National Security Council (NSC), most notably the chief White House adviser to the President on foreign policy affairs. Around them revolves a sort of inner ring of Presidential officials, civilians, and military men from the Pentagon, and career professionals of the intelligence community. It is often quite difficult to tell exactly who many of these men really are, because some may wear a uniform and the rank of general and really be with the CIA and others may be as inconspicuous as the executive assistant to some Cabinet officer's chief deputy.

Out beyond this ring is an extensive and intricate network of government officials with responsibility for, or expertise in, some specific field that touches on national security or foreign affairs: "Think Tank" analysts, businessmen who travel a lot or whose businesses (e.g., import-export or cargo airline operations) are useful, academic experts in this or that technical subject or geographic region, and quite importantly, alumni of the intelligence community -- a service from which there are no unconditional resignations. All true members of the Team remain in the power center whether in office with the incumbent administration or out of office with the hard-core set. They simply rotate to and from official jobs and the business world or the pleasant haven of academe.

Thus, the Secret Team is not a clandestine super-planning-board or super-general-staff. But even more damaging to the coherent conduct of foreign and military affairs, it is a bewildering collection of semi-permanent or temporarily assembled action committees and networks that respond pretty much ad hoc to specific troubles and to flash-intelligence data inputs from various parts of the world, sometimes in ways that duplicate the activities of regular American missions, sometimes in ways that undermine those activities, and very often in ways that interfere with and muddle them. At no time did the powerful and deft hand of the Secret Team evidence more catalytic influence than in the events of those final ninety days of 1963, which the "Pentagon Papers" were supposed to have exposed. The New York Times shocked the world on Sunday, June 13, 1971, with the publication of the first elements of the Pentagon Papers.

The first document the Times selected to print was a trip report on the situation in Saigon, credited to the Secretary of Defense, Robert S. McNamara, and dated December 21, 1963. This was the first such report on the situation in Indochina to be submitted to President Lyndon B. Johnson. It came less than thirty days after the assassination of President John F. Kennedy and less than sixty days after the assassinations of President Ngo Dinh Diem of South Vietnam and his brother and counselor Ngo Dinh Nhu.

Whether from some inner wisdom or real prescience or merely simple random selection, the Times chose to publish first from among the three thousand pages of analysis and four thousand pages of official documents that had come into its hands that report which may stand out in history as one of the key documents affecting national policy in the past quarter-century -- not so much for what it said as for what it signified. This report is a prime example of how the Secret Team, which has gained so much control over the vital foreign and political activities of this government, functions.

... ... ...

...President Harry S. Truman, observing the turn of events since the death of President Kennedy, and pondering developments since his Administration, wrote for the Washington Post a column also datelined December 21, 1963:

For some time I have been disturbed by the way the CIA has been diverted from its original assignment. It has become an operational and at times a policy-making arm of the government.... I never had any thought that when I set up the CIA that it would be injected into peacetime cloak-and-dagger operations.

Some of the complications and embarrassment that I think we have experienced are in part attributable to the fact that this quiet intelligence arm of the President has been so removed from its intended role that it is being interpreted as a symbol of sinister and mysterious foreign intrigue and a subject for cold war enemy propaganda.

Truman was disturbed by the events of the past ninety days, those ominous days of October, November, and December 1963. Men all over the world were disturbed by those events. Few men, however could have judged them with more wisdom and experience than Harry S. Truman, for it was he who, in late 1947, had signed unto law the National Security Act. This Act, in addition to establishing the Department of Defense (DOD) with a single Secretary at its head and with three equal and independent services -- the Army, Navy, and Air Force -- also provided for a National Security Council and the Central Intelligence Agency. And during those historic and sometimes tragic sixteen years since the Act had become law, he had witnessed changes that disturbed him, as he saw that the CIA "had been diverted" from the original assignment that he and the legislators who drafted the Act had so carefully planned.

Although even in his time he had seen the beginning of the move of the CIA into covert activities, there can be little doubt that the "diversion" to which he made reference was not one that he would have attributed to himself or to any other President. Rather, the fact that the CIA had gone into clandestine operations and had been "injected into peacetime cloak-and-dagger operations," and "has been so much removed from its intended role" was more properly attributable to the growing and secret pressures of some other power source. As he said, the CIA had become "a symbol of sinister and mysterious foreign intrigue."


5.0 out of 5 stars XXX

The New Corporate (non-State acting) Privatized One World Order December 4, 2012

While we sit stunned into complete disbelief and silence trying to make sense of, understand, and decode the strongly suspected connections between the most curious political and military events of our times, this author, Colonel L. Fletcher Prouty, in this book, "The Secret Team," has already decoded everything for us. From the JFK assassination, Watergate, the Iran-Contra Affair, the Gulf of Tonkin incident, repeated bank bust-outs (like BCCI and Silverado), the cocaine connection from Mena, Arkansas to Nicaragua, the "crack" cocaine explosion in America's inner cities, the recent housing crash, and the general Wall Street sponsored financial meltdown, and now even from the wildest recesses of our collective imagination (dare I say it, maybe even 9/11?), Colonel Prouty, the fabled Mr. "X" in the movie "JFK," has the bureaucratic structure of all the answers here.

What Colonel Prouty tells us is that right before our own eyes, we are experiencing a paradigm shift in international relations and world affairs, one that has quietly moved us from the "old order," where the sovereign nation and its armies and national ideologies once sat at the center of world events and predominated, into a new "One World business-run corporate, privatized global order," in which "the corporate powers that be" sit on the throne in the clock tower; and where, as a result of their machinations, true national sovereignty has seeped away to the point that we can safely say it no longer exists.

The good Colonel tells us that the most important events of this century are taking place right before our eyes, as the Cold War era has already given way to a new age of "One World" under the control of businessmen and their hired guns, their lawyers -- rather than under the threat of military power and ideological differences. In this new, completely "privatized world order," big business, big lawyers, big bankers, big politicians, big lobbyists, and even bigger money-men, run and rule the entire world from behind a national security screen inaccessible to the average citizen. It is this paradigm shift, and the wall of secrecy that has brought us the "Secret Team" and the series of strange inexplicable events that it has skillfully orchestrated, and that keep recurring from time to time both within the U.S. and throughout the world.

This new bureaucratic entity is called a "Secret Team" for good reasons: because like any team, it does not create its own game plan, its own rules, or its own reality. The team plays for a coach and an owner. It is the coach and the owner who write the scripts and create and "call" the plays. The drama of reality that we see on the international screen is a creation of the "Power Elite," as it is executed by the "Secret Team." The power of the team comes from its vast intergovernmental undercover infrastructure and its direct relationship with private industries, the military, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The beauty of the "Secret Team" is that it is not a clandestine super-planning-board or super-general-staff, as is frequently attributed to the Bilderberg Group or the Trilateral Commission, but a bewildering collection of ad hoc and semi-permanent action committees and networks that can come into being and then dissolve as specific needs, troubles, and flash-points dictate. It can create, influence, or topple governments around the globe at the behest and on the whim of its coaches, the "Power Elite."

As the sociologist C. Wright Mills told us nearly a half century ago, the members of the "Power Elite" operate beyond national borders, beyond the reach of the public, and have no national loyalties -- or even return addresses. They operate in the shadows and run the world by remote control and by making us completely dependent upon them and their hidden machinations. Invisibly, they maneuver and jockey to control every aspect of our lives and the infrastructure and markets upon which we depend for our survival, the most important and essential among them being our ability to produce and distribute our own food, water, and energy. As a result of this dependency, and despite mythology to the contrary, Colonel Prouty tells us that we are becoming the most dependent society that has ever lived. And the future viability of an infrastructure that is not controlled and manipulated by this global "Power Elite" is diminishing to the point of non-existence.

With climate change and terrorism already causing serious disruptions in the normal flow of our lives, governments are becoming less and less able to serve as the people's protector of last resort. Already, one of the politicians who ran for President of the United States in its most recent election, Governor Mitt Romney, suggested that FEMA be turned over to a privately run firm. And all of the agencies of government that he did not suggest be privatized (or that have not already been privatized), except for the military, he suggested be abolished. We also see the concomitant rise of the Blackwaters of the world, private firms that have already begun to take over the lion's share of the responsibilities of our volunteer military. Likewise, our prisons, healthcare system and schools are also being privatized, and everything else is being "outsourced" to the lowest bidder on the global labor market. The book, however, is not just about international politics or international economics, per se, but is also about the primary bureaucratic instrumentality through which the "Power Elite" operates. This instrumentality, as noted above, is called "the Secret Team."

How does Colonel L. Fletcher Prouty know about the "Secret Team"? Because he used to be one of its Pentagon operational managers. I believe, then, that out of prudence, when the man who oversaw and liaised with the "Secret Team" for nine years at the Pentagon as an Air Force Colonel (and who incidentally was also sent on a wild goose chase to Antarctica in order to get him out of the country, days before the JFK assassination) tells us that something is wrong in Denmark, it is high time to listen up. In a chilling narrative, Colonel Prouty relates to us how he found out about the assassination of JFK. It was during a stopover in New Zealand, on his return from the wild goose chase his superiors had sent him on to get him out of the way. Hours BEFORE the assassination had even occurred, somehow the New Zealand press already had the pre-planned talking points on Lee Harvey Oswald. Somehow they mistakenly deployed them prematurely, reporting well in advance of the assassination itself that Oswald was JFK's lone assassin. How could such a thing happen unless there was a very high level conspiracy?

The Secret Team, according to Prouty, consists of a bunch of renegade CIA intelligence operatives who are signed up for life and operate under the full protection and pay of the "Power Elite," itself a cabal of wealthy men with interlocking interests, beholden only to their own hunger for power, profit and greed. The "Power Elite" relies upon this covert team of highly trained specialists to get things done without questions being asked and without moral squeamishness.

Operating outside the normal parameters of political authorization, morality, direction, and law, and hiding behind a wall of national security secrecy, very much like the mafia, the "Secret Team" always gets the job done. They are allowed to ply their immoral trade with both impunity and legal immunity. In short, in the modern era, in the new "One World Corporate Order," they have proven again and again that at worst they are lawless, and at best they are a law unto themselves. The members of the "Secret Team" have become the new jack-booted foot soldiers we see trampling over our dying democracy. As we move deeper and deeper into the uncharted realms of the new corporate-run "One World Order," "we the people" have a lot of questions we must ask ourselves if the democracy we once knew is to endure.

The climax of the book appears in chapter 22 (entitled "Camelot"). It is a beautifully crafted object lesson for the future of what remains of our democracy. It is a narrative summary of how JFK tried but failed to deal with the emerging paradigm shift in power from the Executive branch of the USG to the CIA and the "Secret Team" -- that is to say, from a system of duly elected representatives to one dictated by the whims of the "Power Elite" through their "Secret Team." JFK's assassination is just the most dramatic consequence of how our then young President failed to save the USG from usurpation of its power by a cabal of anonymous evil men intent on ruling the world. Colonel Prouty's story ends somewhat as follows.

The Bay of Pigs operation was the seminal event in the clandestine transfer of power from the "normal government" to the CIA's "Secret Team." It was done primarily via the thinly veiled interface of the military -- playing a dual role as both military officers reporting to their Commander in Chief and, at the same time, as undercover "clandestine operatives" reporting (behind the President's back) to the CIA (and, of course, through it, to the "Power Elite"). In the book, there is little question where their split loyalties lay.

The key ruse that provided the glue that made this high-level grifter-like scam (with the U.S. President as its "mark") work to perfection was the phrase "anti-Communist counterinsurgency." Put to skilful use in the hands of trained specialists, these words had a powerful and purposeful dual meaning. They meant one thing to clandestine insider members of the "Secret Team," and quite another to "no need to know" outsiders like the American public (and in this case the whole USG, including the Commander in Chief, the President of the U.S., JFK himself). This willful ambiguity in terminology, and the duality in the roles of those involved, does most of the heavy lifting in the drama played out by the "insiders" that resulted in the usurpation and shift of power from the Presidency to the CIA.

The Bay of Pigs operation proved to be the defining, seminal and pivotal case in point. It began as a small clandestine "anti-Communist counterinsurgency" operation run by the CIA (as was also the case with Iran, Guatemala, Nicaragua, Indonesia, Laos, Cambodia, Grenada, Angola, and Santo Domingo), ostensibly under the oversight of the USG, but ended up as a huge CIA-run military failure, one minus the requisite oversight from the US President. The devil of how this happened lies in the slimy details that went on behind the scenes and that are skillfully unveiled in this book. They are details that the reader can also get from a careful reading between the lines of "The Pentagon Papers."

As the Bay of Pigs operation slowly morphed from a small-scale USG-run operation "with oversight" into a huge, expensive and poorly run CIA operation without any oversight whatsoever, the rules of the game also changed. They changed from being about U.S. security to being about the greed, power and profits of the "Power Elite," as those objectives were implemented through the "Secret Team." What the "Power Elite" wanted was always accomplished by stoking the ideological fires up to an international boiling point, so that more and more military hardware could be produced, bought and sold.

Likewise, the roles of the primary players also morphed and changed -- from "clandestine operators" in military uniforms, to "military operators" reporting to their CIA handlers, and thus to the "Power Elite." The executive branch (the ostensible oversight body of the government) was none the wiser, since it was not yet aware that it was "being played" by the CIA and thus did not yet know it was being treated in the same way the public is normally treated: as an "excluded outsider" lacking the required "need to know."

Through this bureaucratic sleight of hand, the partial control and power the USG normally exercised in its oversight role had been covertly usurped, as the military operators (and even members of the President's own staff) proved to be "insiders," i.e., members of the "Secret Team," "playing" the President like a bass fiddle as he and his team became the "marks" in an insider's con game in which power and control of the USG was at stake.

When JFK finally wised up, it was too late. By then the train had already left the station, with the CIA firmly in the driver's seat. Since the JFK era, U.S. foreign policy has become a clear case of the CIA tail wagging the USG dog. And the best proof of the evil intentions of the "Secret Team" calling the shots within the CIA is that no sooner had the Bay of Pigs blown up in a spectacular and embarrassing failure than the CIA put the wheels back in motion to duplicate, expand and even generalize this failed bureaucratic formula in Vietnam.

But this time JFK was ready for them and issued NSAM-55 and NSAM-57, both of which were decision directives designed to put the brakes on the CIA and return the usurped power back to the military, where the President was the Commander in Chief. But the CIA was already two steps ahead of JFK. His own staff had been so compromised that he had nowhere to turn. He was penetrated, and thus effectively checkmated, by an agency of his own government. The more he fought back, the more he lost ground, and the more his back was up against the wall. By the time November 22, 1963 came around, JFK literally had no bureaucratic friends and nowhere to turn.

I only regret that an earlier edition of this book had been lying around unread in my library for more than a decade. Five Stars.

Stephen Courts , August 7, 2012

Secret Team (CIA) By Colonel Fletcher Prouty

Though this book is now over 40 years old, I found the information very relevant and 100% trustworthy, coming from one of America's true patriots. Colonel Prouty served his country for decades as a pilot and as an integral part of the Department of Defense and CIA. Though for nine years Colonel Prouty was the liaison between the Air Force and the CIA's clandestine affairs, he is able to reveal confidential information that would typically be classified "Top Secret," because Colonel Prouty did not work for the CIA and therefore did not have to sign a confidentiality agreement with the nefarious CIA.

What is fascinating about Colonel Prouty is that he was everywhere throughout his career. He watched world affairs as they unfolded, meeting the most influential leaders of his time -- from FDR, Stalin, Churchill and Ike to every general and admiral in our military. For the nine years from 1954 to 1963, he was the go-to guy for the military leaders and the president, including both Ike and JFK. In other words, Colonel Prouty writes from personal and direct experience.

Now the meat of the book is about the creation and abuses of the CIA, created in 1947. From the end of World War Two until the mid 1970's, the CIA abused its primary responsibility of intelligence gathering to carry out literally unchecked clandestine and covert upheavals in every part of the world. The CIA, particularly under Allen Dulles, created one coup d'etat after another. The reader will realize that from 1945 until the Marines reached the shores of Viet Nam in 1965, every piece of skulduggery in Viet Nam was done by the CIA. The CIA had infiltrated the entire government, from the Department of Defense to the Department of State. Many people would be shocked to know that what passed as Defense activity was actually generals and admirals, wearing their uniforms and working for the CIA. Whether it was advising the President, subverting Ambassadors or lying to Congress, the CIA ruled and few knew what they were really doing. Colonel Prouty accurately tells the stories of every subversive, nefarious act the CIA was involved in. One example in particular stands out. It was Ike's goal at the end of his 2nd term as president to have a peace conference with the USSR, one to sign a peace treaty and end the cold war. In direct violation of the president's specific instructions not to fly U-2 flights prior to the conference in June of 1960, the CIA flew the ill-fated Gary Powers flight that guaranteed the conference would not go forward. This was a most important conference that could have brought nuclear peace accords decades before they were eventually signed. Dulles and his henchmen not only deliberately ensured that Gary Powers violated the order not to fly these observation flights, they ensured that the plane would be downed by sabotaging the flight, and thus forced Ike to either admit he knew or fire the bastards who embarrassed him. Ike chose to take responsibility, and thus the peace talks were cancelled. There was also another flight, in 1958, that was downed in the Soviet Union.

Most Americans would be shocked to know the CIA has its own private airline, Air America. This is no small airline. Had Colonel Prouty written this book later, he could have connected the CIA with the massive drug smuggling that has devastated American cities. They use the proceeds of this smuggling to finance their illicit involvement in other sovereign countries.

Bottom line: this is an important book, as is his 1993 JFK & Viet Nam. Colonel Prouty was a significant advisor to Oliver Stone and his masterpiece, JFK. I am currently finishing rereading said book. If you want to know who has controlled our foreign policy (against the charter that created this monstrosity) since the mid 1940's, this is an excellent book to begin with. It is my personal opinion, having read many books on the CIA, that their main function is to serve the multi-national corporations and the bankers that exploit the less developed countries around the world, and to ensure that there will never be peace. There will not be a World War Three, because nuclear weapons would most likely be used and earth as we know it would cease to exist. Therefore, limited, no-win conflicts will continually persist -- beginning with Korea, to Viet Nam, to Iraq, to Afghanistan. The irony is that we are wasting our human resources and our treasury to bankrupt our country while both Russia and China sit back, spend zero (the USSR in Afghanistan being the exception) and develop the kind of infrastructure and consumer goods, as well as education, that we should be developing.

Finally, the record of the CIA leaves a lot to be desired. There were many failures despite billions of dollars spent and the infiltration into every branch of our society, from education to media to think tanks to the military. Read this book and you will also discover the misadventure in Viet Nam that cost 58,000-plus American casualties, millions of Vietnamese lives, and millions of servicemen who would never be the same after this debacle. Colonel Prouty explains this better than anyone I have yet read. He predicted another debacle (Iraq & Afghanistan) after the Viet Nam debacle. I believe Colonel Prouty passed away last decade, but he would not have been shocked by the ridiculous misadventures in both of the aforementioned countries. Think of the trillions of dollars and the bloodshed lost on a military misadventure that has no way of producing a positive outcome for the United States.

Stephen Courts
August 7, 2012

Jeff Marzano , December 17, 2014
An American Hero Reveals The Shocking Truth

This book provides a rare glimpse into the secret history and evil machinations of the CIA as it mutated from its original form, up until the time the book was published in 1973, by which point it had become a cancerous blight within the government.

It should not be surprising that most people never really understood the so-called Vietnam War, and they still don't. Even people in the American government, like the Secretary of Defense, were completely confused and manipulated by the Agency, as it's called.

President Kennedy was somewhat inexperienced when he first entered office. JFK thought he could handle problems in the government in the same way he handled problems during his presidential campaign. He had an informal style at first where he would just ask a friend to take care of it. This caused JFK to disregard important checks and balances which had been set up to hopefully prevent the CIA from crossing the line from being just an intelligence agency into the realm of initiating clandestine military operations.

The National Security Council was supposed to give direction to the CIA, and then the Operations Coordinating Board was supposed to verify that the CIA had done what they were told and only what they were told. But even before JFK got into office, the Agency had taken many determined steps to undermine those controls.

JFK's informal style opened the door even wider for the Agency to circumvent whatever controls may have still been effective to put some sort of limits on their 'fun and games'. Having an informal style with them was dangerous because they were experts at getting around all sorts of rules and laws.

The Agency double crossed JFK during the Bay Of Pigs debacle. Publicly JFK took the blame for what happened but according to Fletcher it was the CIA who cancelled the air support that would have destroyed Fidel Castro's planes on the ground. As a result JFK's only options were to accept the blame or admit to the world that things were being done by the American military establishment that he wasn't even aware of. John Kennedy was a fast learner however and he stated that he would break the CIA up into a thousand tiny pieces. JFK was fed up with all of the Agency's fun and games.

Something similar happened with the Gary Powers U-2 spy plane that went down in the Soviet Union. The evil Secret Team sabotaged the U-2 to derail President Eisenhower's lifelong dream of holding a worldwide peace summit. Like JFK, Ike accepted the blame publicly.

Ike's only other option would have been to admit that the U-2 flight was unauthorized and then fire Allen Dulles and the other leaders of the evil Secret Team. But Fletcher says Ike couldn't do this for various reasons, even though Nikita Khrushchev probably realized that Eisenhower did not break his word and authorize the U-2 mission.

Ike's comments about the Military Industrial Complex which he made during his farewell address turned out to be very prophetic indeed.

These examples provide the picture of an Agency that had become a law unto itself which reinterpreted whatever orders it was given to make those orders conform to their evil schemes. Fletcher provides many details in the book about how the Agency was able to circumvent laws and regulations and manipulate anyone and everyone in the government starting with the president. They did this mainly by abusing their control of secrecy but they used many other methods as well.

Secret Team leader Allen Dulles wrote a book called 'The Craft of Intelligence'. The title of this book sort of indicates the very problem Fletcher Prouty explains in his book. Dulles viewed himself as a sort of artist or craftsman who could distort information and make it appear in any form he wanted. Strangely, Fletcher refers to his close personal friendship with Allen Dulles in the acknowledgements at the beginning of the book, but then spends the rest of the book portraying Dulles as a sort of Joseph Goebbels figure.

Fletcher spends over 300 pages describing the metamorphosis which occurred with the CIA as it veered very far afield from what President Truman had intended when he created the Agency. Then, towards the end of the book, Fletcher finally reveals his shocking conclusions about what this massive abuse of power led to.

Fletcher felt that the assassination of president Kennedy was the single most pivotal event in modern American history as far as the changes that the assassination caused.

Sadly, as Fletcher points out, the Vietnam War never really had any military objective. The theory was that if South Vietnam fell this would cause a domino effect and the dreaded communism monster would start gobbling up the entire world. Then, when South Vietnam did fall with no domino effect, the Secret Team published a group of documents called the Pentagon Papers. These documents deflected blame away from the CIA and said nobody listened to the CIA when it warned that the Vietnam situation was not winnable.

But it wouldn't matter if anyone listened to the Secret Team anyway because they always lie.

This book presents an American government in chaos during the Vietnam era. It was a government that had been hijacked by the evil Secret Team.

After the Bay of Pigs incident, Fidel Castro apparently got fed up with the CIA and America in general. Castro turned to the Soviet Union instead. This led to the Cuban Missile Crisis. It was only in the last 10 years or so that people realized just how close the world came to an all-out nuclear exchange at that time.

This was a very dangerous game master craftsman Allen Dulles and his other liars were playing. They were like kids starting fires all over the place in a big field and then just sitting back and seeing which of those fires would become an inferno, as Vietnam did.

Also, in recent years people have implicated Lyndon Johnson as being part of the conspiracy to assassinate JFK. So LBJ was on the team also.

I'm not sure if Fletcher ever really spells out what the true motivations of the Secret Team were but he hints at it. Probably the three main reasons that people engage in criminal activity are sex, money, and revenge. Usually when crimes are committed there's a money trail somewhere. And in the case of government military spending that's a very long trail.

This is a serious book which contains many details about an approximately 25 year period that began after World War II. It is not light reading.

Watch this documentary series on the internet. The hypocrites have pulled it off the market:

The Men Who Killed Kennedy

The Men Who Killed Kennedy DVD Series - Episode List

1. "The Coup D'Etat" (25 October 1988)
2. "The Forces Of Darkness" (25 October 1988)
3. "The Cover-Up" (20 November 1991)
4. "The Patsy" (21 November 1991)
5. "The Witnesses" (21 November 1991)
6. "The Truth Shall Set You Free" (1995)

The Final Chapter episodes (internet only):

7. "The Smoking Guns" (2003)
8. "The Love Affair" (2003)
9. "The Guilty Men" (2003)

Herman , February 4, 2017
Extensive analysis of the CIA from its inception to the 1970's

The fact that this book all but disappeared when it was distributed in the 1970's tells us that the CIA did not want any of its "dirty laundry" aired in public. Prouty does an excellent (almost over the top) job of describing the rise, strategies and evolution of the CIA up through the 70's. That the Vietnam War was still controlled by the CIA at the writing of the original book also shows JFK had not gained control of the military-industrial complex. For those who are wanting to fill in more pieces of the puzzle, this is an excellent source from a man who found himself in the thick of things for many years. The one shortcoming comes in the last chapter, in his description of Nixon and especially LBJ not being able to control the military-industrial complex either.

Subsequent independent research over many years seems to show that LBJ, who was about to go to jail and be dropped from the 1964 ticket, knew about and helped cover up the JFK assassination, and he is known to have remarked: "Just get me elected and you can have your damn war".

There is also evidence Nixon and company undermined the 1968 peace talks as LBJ was trying to end the war, and LBJ actually called Nixon and asked him to back off (kinda like the October 1980 surprise by Reagan). Consequently, we know from Judyth Vary Baker that Lee Oswald was not the assassin of JFK and in fact was on the payroll of the FBI and CIA.

James E. Files has confessed to being one of the shooters, and E. Howard Hunt told his son he was involved and was CIA at the time. But no one man can possibly know everything. Given the pervasive infiltration of government, military and probably many civil institutions by the CIA, one wonders who comprises the shadow government in reality?

Boyce Hart , July 22, 2010
The Critical Sinews btw CIA and other Gov. Agencies

What does it mean when we say "the CIA did such and such an action"? Just what is the CIA, a whole or a part? Given its emphasis on compartmentalization, is it accurate to say "the CIA was heavily involved in the JFK assassination," or would it be more accurate to say parts of the CIA were? Moreover, who is the CIA, and what are the powers behind it? Also, perhaps most importantly, what were the relations between the CIA and other parts of government, and how and when did these relationships change and evolve? Were these changes made democratically or secretly? These last two questions are the essence of this book. Yes, it is true, as one reviewer noted, that this book could have used an editor. Sometimes it has the feel of a collection of speeches, but not always. So why five stars instead of four? The subject matter -- in particular the last two questions typed above -- is just too rarely mentioned and discussed. This book really helps us understand the curiously evolving nervous system of the CIA between 1947 and 1963, as very, very few other books do. It sees the inception of the CIA in 1947 as just the first step, and makes it clear that later developments were neither willed nor pre-ordained by many of the elected officials who wrote the National Security Act of 1947.

The only other book that really addresses this BETWEEN WORLD -- i.e., between the CIA and other government agencies -- is one of the three most important books published in the last 50 years, IMO: Thy Will Be Done: Nelson Rockefeller, Evangelism, and the Conquest of the Amazon in the Age of Oil, by Colby and Dennett.

Still, there is one book I recommend even more than that one. It is not merely the current Gold Standard for all JFK research; it is far more than that -- it is the Gold Standard for all US Cold War history research: JFK and the Unspeakable: Why He Died and Why It Matters, by James W. Douglass. This book is so important because it is not merely about who done it, but why. It is a book that mixes the how and why of JFK and those crucial-because-contestable Cold War years, 1960-63, like no other.

Luc REYNAERT , November 30, 2008
A symbol of sinister and mysterious foreign intrigue (H. Truman)

This is an extremely important book. The proof of it is that even the official copy in the Library of Congress disappeared (!). Moreover, even after his death, the author continues to be the object of a smear campaign (see internet).

His book is not less than a frontal attack on US intelligence and concomitantly on those who control it.
Its portrait of Allen Dulles, a longtime intelligence director, says it all: `I am a lawyer'; in other words, a servant. But of whom?
This book unveils the existence of a secret cabal, a Power Elite (G. William Domhoff), a `deep State' (P.D. Scott) within the US and its government as well as in about 40 host countries.
This Power Elite uses the Secret Team of top intelligence and military commanders as its long arm and protects it. Together they stand above the law and the democratic process. They get things done, whether they have the political authorization or not.
They have at their disposal a vast undercover political, military, intelligence, business, media and academic infrastructure, in the US as well as worldwide. They don't respect the nation-state and are able to create, influence and topple governments in the hemisphere they control.

The author gives a remarkable insight into the inner workings, the logistics, the strategies and the tactics of the intelligence agency. Its creation and history show that President H. Truman never intended to create an autonomous operational agency in the clandestine field. L.F. Prouty also gives valuable information about the U-2/G. Powers incident (apparently staged to torpedo the US/USSR peace talks) and the Pentagon Papers (an intelligence whitewash).

At the end, the author poses the all important question: `Can any President ever be strong enough really to rule?'

This book is a must read for all those interested in US history and for all those who want to understand the world we live in.

For more information on the Power Elite, I recommend the works of O. Tunander, D. Estulin, Peter Dale Scott, Carroll Quigley, Gary Allen and G. W. Domhoff.

anarchteacher , April 30, 2008
An Insider's Candid Exposé of the National Security State

As in the case of the brilliant Jules Archer volume, The Plot To Seize The White House, it is terrific to have this masterful study of the inner workings of the early CIA back in print after so many years of unavailability.

Skyhorse Publishing is to be commended in seeing to it that both of these crucial works are again available to the attentive reading public who want to know the truth concerning our dark hidden history that the government has so actively strived to keep buried.

The late Colonel L. Fletcher Prouty served as chief of special operations for the Joint Chiefs of Staff where he was in charge of the global system designed to provide military support for covert activities of the Central Intelligence Agency.

In Oliver Stone's highly acclaimed film on the assassination of President John Fitzgerald Kennedy, JFK, the mysterious character "X" portrayed by Donald Sutherland was in fact Colonel Prouty, who assisted director Stone in the production and scripting of this historical epic. Prouty had relayed the shocking information detailed in the movie to the actual New Orleans District Attorney Jim Garrison, played by Kevin Costner, in a series of communiques.

The Secret Team was first published in 1973 during the Watergate scandal, when many Americans were first learning about the dark side of covert government, an outlaw executive branch headed by a renegade chief of state. Richard Nixon would not be the last of this foul breed.

This was years before Frank Church's Senate Committee's damning revelations of CIA misdeeds and assassination plots against foreign leaders rocked the nation.

In each chapter of his book, Prouty speaks frankly, with an insider's knowledge, about what he describes as the inner workings of "the Secret Team."

This prudential judgment and keen assessment of the National Security Establishment was gained from years as a seasoned behind-the-scenes professional in military intelligence, working intimately with those of the highest rank in policy making and implementation.

The important story Prouty boldly tells should be read by every reflective American.

SER , December 6, 2001
Best Book On CIA Misdeeds

The author was the liaison officer between the CIA and the military during the '50s and '60s. As an Air Force officer (Colonel), he was exempt from taking the CIA oath of secrecy and therefore was in a position to write the book in 1973. Apparently, shortly after the book's publication, almost all copies disappeared, probably bought up by the CIA. I was lucky to find a copy, published in Taiwan (Imperial Books & Records), in a used bookstore several years ago. The author details not only how the CIA conducts its operations, but more importantly, how it manages to keep most or all of its deeds from the eyes of Congress, the population and even the President, if necessary. This is the best book I've read on the secret workings of the CIA and its misdeeds during the '50s and early '60s. Not to belittle them, but The Secret Team is a far more informative book than Marchetti and Marks' The CIA and the Cult of Intelligence....

added, Jan09:

Actually, practically ever since I posted the review, I've been wanting to write a more detailed one, but since it's now been some 20 years since I read the book, I can't remember enough details to do it justice. If I ever reread it, I'll be sure to post a better review. I frankly think my present "review" isn't much of one - and it was cut short after my reference to the Marchetti/Marks book, the linking to which was not allowed at the time.

For example, one item of considerable current interest which I remember from the book is the author's detailing of Operation Northwoods, from the early 1960's - the plan by the intelligence agencies to conduct a false flag attack against American interests and blame it on Cuba, in order to justify a war against that country.
There was a big deal made about this (deservedly, in my opinion), only four or five years ago, when the National Security Archive (an apparently independent non-governmental research institute at George Washington University) discovered the details of this proposed operation, supposedly for the first time, in declassified documents. (This was in light of the ongoing conspiratorial controversies surrounding the 9-11 events.)
Yet, author Prouty detailed Operation Northwoods in his The Secret Team, first published long ago in 1973.
This is but one detail that indicates a much-needed elaborate review of this book.

I'd like to also add (since it is now apparently allowed) that The Secret Team, among other items, is available on CD from the L. Fletcher Prouty Reference Site: http://www.prouty.org/

Finally, for readers still obsessed with the JFK assassination, I would like to recommend Final Judgment - The Missing Link in the JFK Assassination Conspiracy, by Michael Collins Piper, a book which lives up to its title. My use of the word "obsessed" is not meant derogatorily, as I have my own bookshelf-full as testament to that particular subject, but as an inducement to read the book, which will make the big picture very clear indeed. Do yourselves the favor.

Last edit: Jan09

Michael Tozer , July 7, 2006
Great!

Colonel Prouty's book on the Secret Team should be required reading for all concerned Americans. Herein, the author, a retired Air Force Colonel and CIA insider, reveals for all to see the machinations of the Secret Team and their impact on US history in the post World War II era. This is terribly important information.

I was particularly impressed with Prouty's depiction of Eisenhower's peace initiative and how it was sabotaged by the Secret Team. Ike was preparing for his peace summit with Khrushchev when Gary Powers was sent off on his fool's errand on April 30th, 1960, a date with significant occult emblematics. The capture of Powers by the Soviets effectively scuttled the Eisenhower peace plan, which would have ruined the plans of the Secret Team for continued Cold War tension, and treasure for the merchants of venom.

The essential truths in this important book are still relevant today. Of course, the ineffectual George Walker Bush is not entirely in charge of American foreign policy in this critical time. He is certainly still being manipulated by the successors of the Secret Team depicted in this excellent and well written book. Any serious student of American foreign policy in the post World War II era ought to read this important book.

D. E. Tench , May 24, 2013
Conspiracy History - not Theory!

The Colonel's book contains valuable and legitimate insider information about how factions within our government have been dishonest, selfish, and ruthlessly brutal for a very long time now. He shows the reader more than one vivid picture of how our American Presidents are routinely hoodwinked and manipulated by CIA moles - covert operators who often work within the D.C. Beltway.

I only wish he had expanded on the following statement (from page 15 of the 1973 edition): "There were and are many men who are not in government who are prime movers of Secret Team activity." Perhaps he knew enough to mention their connection to and influence over the Agency, but not enough to elaborate upon it. Or perhaps he knew better than to push that topic too far if he wanted to get published. In 1973 there were no on-demand self-publishing formats like what we have available to us today.

Prouty also mentions the non-governmental elements of secrecy in Chapter 23, but it's closer to a defining of terms than an elaboration. He ends the book with a view of the Secret Team as an evolved and faceless mechanism serving the cause of anti-Communism. Today, the cause du jour is anti-terrorism. However, I argue that secret teams are never faceless, but made up of individuals.

The Secret Team that Col. Prouty revealed was part of a larger Secret Team. My book: "Know Your Enemy: Exposing Satan's Human Army" discusses the spiritual state of secretive operators and some of what scripture reveals on the topic.

[Apr 04, 2019] Fascism: A Warning by Madeleine Albright

Junk author, junk book by the butcher of Yugoslavia, who along with Bill Clinton would have been hanged by a Nuremberg Tribunal for crimes against peace. Albright is not bright at all: she is a female bully, and it shows.
Mostly projection. And this arrogant warmonger likes to exercise in Russophobia (directed at Russia, which was the main part of the USSR that saved the world from fascism, sacrificing around 20 million people). This book is a denial of the genocide against the Iraqi and Serbian populations, where bombing with depleted-uranium munitions doubled cancer cases. If you can pass over those facts, then this book is for you.
Like Robert Kagan and other neocons, Albright is waving the dead chicken of authoritarianism again and again. That's silly and disingenuous: authoritarianism is a method of governance used in the military. It is not an ideology. Fascism is an ideology, a flavor of far-right nationalism, "enhanced" by some socialist ideas.
Viewing fascism apart from the economic circumstances that create it, first of all the immiseration of the middle and working classes and high unemployment, is a primitive, ahistorical view. Fascism is the ultimate capitalist statism, acting simultaneously as a civil religion for the population, enforced by the power of the state. It has a lot in common with neoliberalism, which is why neoliberalism is sometimes called "inverted totalitarianism".
In reality, while fascism remains the dictatorship of capitalists, for capitalists, and of the national part of the financial oligarchy, like neoliberalism it is directed against the working class. Fascism comes to power on populist slogans of righting the wrongs of the previous regime and kicking out foreign capitalists and national compradors (who in Germany turned out to be mostly Jewish).
It comes to power under slogans of stopping the redistribution of wealth upward and eliminating the class of rentiers -- all citizens should earn income, not get it from bonds and other investments (often in reality doing completely the opposite).
While intrinsically connected with and financed by a sizable part of the national elite, which often consists of the far-right military leadership, a part of the financial oligarchy, and a large part of the lower middle class (small proprietors), it is a protest movement which wants revenge for humiliation and prefers a military-style organization of society over democracy, as a more potent weapon for achieving this goal.
Like any far-right movement, the rise of fascism and neo-fascism is a sign of internal problems within a given society, and often a threat to the state or social order.
Apr 04, 2019 | www.amazon.com

Still another noted that Fascism is often linked to people who are part of a distinct ethnic or racial group, who are under economic stress, and who feel that they are being denied rewards to which they are entitled. "It's not so much what people have," she said, "but what they think they should have -- and what they fear." Fear is why Fascism's emotional reach can extend to all levels of society. No political movement can flourish without popular support, but Fascism is as dependent on the wealthy and powerful as it is on the man or woman in the street -- on those who have much to lose and those who have nothing at all.

This insight made us think that Fascism should perhaps be viewed less as a political ideology than as a means for seizing and holding power. For example, Italy in the 1920s included self-described Fascists of the left (who advocated a dictatorship of the dispossessed), of the right (who argued for an authoritarian corporatist state), and of the center (who sought a return to absolute monarchy). The German National Socialist Party (the Nazis) originally came together around a list of demands that catered to anti-Semites, anti-immigrants, and anti-capitalists but also advocated for higher old-age pensions, more educational opportunities for the poor, an end to child labor, and improved maternal health care. The Nazis were racists and, in their own minds, reformers at the same time.

If Fascism concerns itself less with specific policies than with finding a pathway to power, what about the tactics of leadership? My students remarked that the Fascist chiefs we remember best were charismatic. Through one method or another, each established an emotional link to the crowd and, like the central figure in a cult, brought deep and often ugly feelings to the surface. This is how the tentacles of Fascism spread inside a democracy. Unlike a monarchy or a military dictatorship imposed on society from above, Fascism draws energy from men and women who are upset because of a lost war, a lost job, a memory of humiliation, or a sense that their country is in steep decline. The more painful the grounds for resentment, the easier it is for a Fascist leader to gain followers by dangling the prospect of renewal or by vowing to take back what has been stolen.

Like the mobilizers of more benign movements, these secular evangelists exploit the near-universal human desire to be part of a meaningful quest. The more gifted among them have an aptitude for spectacle -- for orchestrating mass gatherings complete with martial music, incendiary rhetoric, loud cheers, and arm-lifting salutes. To loyalists, they offer the prize of membership in a club from which others, often the objects of ridicule, are kept out. To build fervor, Fascists tend to be aggressive, militaristic, and -- when circumstances allow -- expansionist. To secure the future, they turn schools into seminaries for true believers, striving to produce "new men" and "new women" who will obey without question or pause. And, as one of my students observed, "a Fascist who launches his career by being voted into office will have a claim to legitimacy that others do not."

After climbing into a position of power, what comes next: How does a Fascist consolidate authority? Here several students piped up: "By controlling information." Added another, "And that's one reason we have so much cause to worry today." Most of us have thought of the technological revolution primarily as a means for people from different walks of life to connect with one another, trade ideas, and develop a keener understanding of why men and women act as they do -- in other words, to sharpen our perceptions of truth. That's still the case, but now we are not so sure. There is a troubling "Big Brother" angle because of the mountain of personal data being uploaded into social media. If an advertiser can use that information to home in on a consumer because of his or her individual interests, what's to stop a Fascist government from doing the same? "Suppose I go to a demonstration like the Women's March," said a student, "and post a photo on social media. My name gets added to a list and that list can end up anywhere. How do we protect ourselves against that?"

Even more disturbing is the ability shown by rogue regimes and their agents to spread lies on phony websites and Facebook. Further, technology has made it possible for extremist organizations to construct echo chambers of support for conspiracy theories, false narratives, and ignorant views on religion and race. This is the first rule of deception: repeated often enough, almost any statement, story, or smear can start to sound plausible. The Internet should be an ally of freedom and a gateway to knowledge; in some cases, it is neither.

Historian Robert Paxton begins one of his books by asserting: "Fascism was the major political innovation of the twentieth century, and the source of much of its pain." Over the years, he and other scholars have developed lists of the many moving parts that Fascism entails. Toward the end of our discussion, my class sought to articulate a comparable list.

Fascism, most of the students agreed, is an extreme form of authoritarian rule. Citizens are required to do exactly what leaders say they must do, nothing more, nothing less. The doctrine is linked to rabid nationalism. It also turns the traditional social contract upside down. Instead of citizens giving power to the state in exchange for the protection of their rights, power begins with the leader, and the people have no rights. Under Fascism, the mission of citizens is to serve; the government's job is to rule.

When one talks about this subject, confusion often arises about the difference between Fascism and such related concepts as totalitarianism, dictatorship, despotism, tyranny, autocracy, and so on. As an academic, I might be tempted to wander into that thicket, but as a former diplomat, I am primarily concerned with actions, not labels. To my mind, a Fascist is someone who identifies strongly with and claims to speak for a whole nation or group, is unconcerned with the rights of others, and is willing to use whatever means are necessary -- including violence -- to achieve his or her goals. In that conception, a Fascist will likely be a tyrant, but a tyrant need not be a Fascist.

Often the difference can be seen in who is trusted with the guns. In seventeenth-century Europe, when Catholic aristocrats did battle with Protestant aristocrats, they fought over scripture but agreed not to distribute weapons to their peasants, thinking it safer to wage war with mercenary armies. Modern dictators also tend to be wary of their citizens, which is why they create royal guards and other elite security units to ensure their personal safety. A Fascist, however, expects the crowd to have his back. Where kings try to settle people down, Fascists stir them up so that when the fighting begins, their foot soldiers have the will and the firepower to strike first.


petarsimic , October 21, 2018

Madeleine Albright on a million Iraqi dead: "We think the price is worth it"

Hypocrisy at its worst from a lady who advocated hawkish foreign policy which included the most sustained bombing campaign since Vietnam, when, in 1998, Clinton began almost daily attacks on Iraq in the so-called no-fly zones, and made so-called regime change in Iraq official U.S. policy.

In May of 1996, 60 Minutes aired an interview with Madeleine Albright, who at the time was Clinton's U.N. ambassador. In connection with the Clinton administration presiding over the most devastating sanctions regime in history, which the U.N. estimated took the lives of as many as a million Iraqis, the vast majority of them children, correspondent Lesley Stahl said to Albright: "We have heard that a half-million children have died. I mean, that's more children than died in Hiroshima. And -- and, you know, is the price worth it?"

Madeleine Albright replied, "I think this is a very hard choice, but the price -- we think the price is worth it."

P. Bierre , June 11, 2018
Does Albright present a comprehensive enough understanding of fascism to instruct on how best to avoid it?

While I found much of the story-telling in "Fascism" engaging, I come away expecting much more of one of our nation's pre-eminent senior diplomats. In a nutshell, she has devoted a whole volume to describing the ascent of intolerant fascism and its many faces, but punted on the question "How should we thwart fascism going forward?"

Even that question leaves me a bit unsatisfied, since it is couched in double-negative syntax. The thing there is an appetite for, among the readers of this book who are looking for more than hand-wringing about neofascism, is a unifying title or phrase which captures in single-positive syntax that which Albright prefers over fascism. What would that be? And, how do we pursue it, nurture it, spread it and secure it going forward? What is it?

I think Albright would perhaps be willing to rally around "Good Government" as the theme her book skirts tangentially from the dark periphery of fascistic government. "Virtuous Government"? "Effective Government"? "Responsive Government"?

People concerned about neofascism want to know what we should be doing right now to avoid getting sidetracked into a dark alley of future history comparable to the Nazi brown shirt or Mussolini black shirt epochs. Does Albright present a comprehensive enough understanding of fascism to instruct on how best to avoid it? Or, is this just another hand-wringing exercise, a la "you'll know it when you see it", with a proactive superficiality stuck at the level of pejorative labelling of current styles of government and national leaders? If all you can say is what you don't want, then the challenge of threading the political future of the US is left rudderless. To make an analogy to driving a car, if you don't know your destination, and only can get navigational prompts such as "don't turn here" or "don't go down that street", then what are the chances of arriving at a purposive destination?

The other part of this book I find off-putting is that Albright, though having served as Secretary of State, never talks about the heavy burden of responsibility that falls on a head of state. She doesn't seem to empathize at all with the challenge of top leadership. Her perspective is that of the detached critic. For instance, in discussing President Duterte of the Philippines, she fails to paint the dire situation under which he rose to national leadership responsibility: Islamic separatists having violently taken over the entire city of Marawi, or the ubiquitous spread of drug cartel power to the level where control over law enforcement was already ceded to the gangs in many places...entire islands and city neighborhoods run by mafia organizations. It's easy to sit back and criticize Duterte's unleashing of vigilante justice -- what was Mrs. Albright's better alternative to regain ground from vicious, well-armed criminal organizations? The distancing from leadership responsibility makes Albright's treatment of the Philippines' twin crises of gang rule and Islamist revolutionaries seem like so much academic navel-gazing....OK for an undergrad course at Georgetown maybe, but unworthy of someone who served in a position of high responsibility. Duterte is liked in the Philippines. What he did snapped back the power of the cartels, and returned a deserved sense of security to average Filipinos (at least those not involved with narcotics). Is that not good government, given the horrendous circumstances Duterte came up to deal with? What lack of responsibility in former Philippine leadership allowed things to get so out of control? Is it possible that Democrats and liberals are afraid to be tough, when toughness is what is needed? I'd much rather read an account from an average Filipino about the positive impacts of the vigilante campaign, than listen to Madame Secretary sermonizing out of context about Duterte. OK, he's not your idea of a nice guy.
Would you rather sit back, prattle on about the rule of law and due process while Islamic terrorists wrest control over where you live? Would you prefer the leadership of a drug cartel boss to Duterte?

My critique is offered in a constructive manner. I would certainly encourage Albright (or anyone!) to write a book in a positive voice about what it's going to take to have good national government in the US going forward, and to help spread such abundance globally. I would define "good" as the capability to make consistently good policy decisions, ones that continue to look good in hindsight, 10, 20 or 30 years later. What does that take?

I would submit that the essential "preserving democracy" process component is having a population that is adequately prepared for collaborative problem-solving. Some understanding of history is helpful, but it's simply not enough. Much more essential is for every young person to experience team problem-solving, in both its cooperative and competitive aspects. Every young person needs to experience a team leadership role, and to appreciate what it takes from leaders to forge constructive design from competing ideas and champions. Only after serving as a referee will a young person understand the limits to "passion" that individual contributors should bring to the party. Only after moderating and herding cats will a young person know how to interact productively with leaders and other contributors. Much of the skill is counter-instinctual. It's knowing how to express ideas...how to field criticism....how to nudge people along in the desired direction...and how to avoid ad-hominem attacks, exaggerations, accusations and speculative grievances. It's learning how to manage conflict productively toward excellence. Way too few of our young people are learning these skills, and way too few of our journalists know how to play a constructive role in managing communications toward successful complex problem-solving. Albright's claim that a journalist's job is primarily to "hold leaders accountable" really betrays an absolving of responsibility for the media as a partner in good government -- it doesn't say whether the media are active players on the problem-solving team (which they have to be for success), or mere spectators with no responsibility for the outcome. If the latter, then journalism becomes an irritant, picking at the scabs over and over, but without any forward progress. When the media takes up a stance as an "opponent" of leadership, you end up with poor problem-solving results....the system is fighting itself instead of making forward progress.

"Fascism" doesn't do nearly enough to promote the teaching of practical civics 101 skills, not just to the kids going into public administration, but to everyone. For, it is in the norms of civility, their ability to be practiced, and their defense against excesses, that fascism (e.g., Antifa) is kept at bay.
Everyone in a democracy has to know the basics:
• when entering a disagreement, don't personalize it
• never demonize an opponent
• keep a focus on the goal of agreement and moving forward
• never tell another person what they think, but ask (non-rhetorically) what they think then be prepared to listen and absorb
• do not speak untruths or exaggerate to make an argument
• do not speculate grievance
• understand truth gathering as a process; detect when certainty is being bluffed; question sources
• recognize impasse and unproductive argumentation and STOP IT
• know how to introduce a referee or moderator to regain productive collaboration
• avoid ad hominem attacks
• don't take personally things that rankle you
• give the benefit of the doubt in an ambiguous situation
• don't jump to conclusions
• don't reward theatrical manipulation

These basics of collaborative problem-solving are the guts of a "liberal democracy" that can face down the most complex challenges and dilemmas.

I gave the book 3 stars for the great story-telling, and Albright has been part of a great story of late 20th century history. If she would have told us how to prevent fascism going forward, and how to roll it back in "hard case" countries like North Korea and Sudan, I would have given her a 5. I'm not that interested in picking apart the failure cases of history...they teach mostly negative exemplars. Much rather I would like to read about positive exemplars of great national government -- "great" defined by popular acclaim, by the actual ones governed. Where are we seeing that today? Canada? Australia? Interestingly, both of these positive exemplars have strict immigration policies.

Is it possible that Albright is just unable, by virtue of her narrow escape from Communist Czechoslovakia and acceptance in NYC as a transplant, to see that an optimum immigration policy in the US, something like Canada's or Australia's, is not the looming face of fascism, but rather a move to keep it safely in its corner in coming decades? At least, she admits to her being biased by her life story.

That suggests her views on refugees and illegal immigrants as deserving of unlimited rights to migrate into the US might be the kind of cloaked extremism that she is warning us about.

Anat Hadad , January 19, 2019
"Fascism is not an exception to humanity, but part of it."

Albright's book is a comprehensive look at recent history regarding the rise and fall of fascist leaders; as well as detailing leaders in nations that are starting to mimic fascist ideals. Instead of a neat definition, she uses examples to bolster her thesis of what are essential aspects of fascism. Albright dedicates each section of the book to a leader or regime that enforces fascist values and conveys this to the reader through historical events and exposition while also peppering in details of her time as Secretary of State. The climax (and 'warning'), comes at the end, where Albright applies what she has been discussing to the current state of affairs in the US and abroad.

Overall, I would characterize this as an enjoyable and relatively easy read. I think the biggest strength of this book is how Albright uses history, previous examples of leaders and regimes, to demonstrate what fascism looks like and the contributing factors on a national and individual level. I appreciated that she lets these examples speak for themselves of the dangers and subtleties of a fascist society, which made the book more fascinating and less of a textbook. Her brief descriptions of her time as Secretary of State were intriguing and made me more interested in her first book, 'Madam Secretary'. The book does seem a bit slow, as it is not until the end that Albright blatantly reveals the relevance of all of the history relayed in the first couple hundred pages. The last few chapters are dedicated to the reveal: the Trump administration and how it has affected global politics. Although she never outright calls Trump a fascist, instead letting the reader decide based on his decisions and what you have read in the book leading up to this point, her stance is quite clear by the end. I was surprised at what I shared politically with Albright, mainly on immigration and a belief in empathy and understanding for others. However, I got a slight sense of anti-secularism in the form of a disdain for those who do not subscribe to an Abrahamic religion, and she seemed to hint at this being partly an opening to fascism.

I also could have done without the both-sides-ism she would occasionally push, which seems to be a tactic used to encourage people to 'unite against Trump'. These are small annoyances I had with the book, my main critique is the view Albright takes on democracy. If anything, the book should have been called "Democracy: the Answer" because that is the most consistent stance Albright takes throughout. She seems to overlook many of the atrocities the US and other nations have committed in the name of democracy and the negative consequences of capitalism, instead, justifying negative actions with the excuse of 'it is for democracy and everyone wants that' and criticizing those who criticize capitalism.

She does not do a good job of conveying the difference between a communist country like Russia and a socialist country like those found in Scandinavia and seems okay with the idea of the reader lumping them all together in a poor light. That being said, I would still recommend this book for anyone's TBR as the message is essential for today, that the current world of political affairs is, at least somewhat, teetering on a precipice and we are in need of as many strong leaders as possible who are willing to uphold democratic ideals on the world stage and mindful constituents who will vote them in.

Matthew T , May 29, 2018
An easy read, but incredibly ignorant and one-eyed in far too many instances

The book is very well written, easy to read, and follows a pretty standard formula making it accessible to the average reader. However, it suffers immensely from, what I suspect are, deeply ingrained political biases from the author.

Whilst I don't dispute the criteria the author applies in defining fascism, or the targets she cites as examples, the first bias creeps in here when one realises the examples chosen are traditional easy targets for the US (with the exception of Turkey). The same criteria would define a country like Singapore perfectly as fascist, yet the country (or Malaysia) does not receive a mention in the book.

Further, it grossly glosses over what Ms. Albright terms fascist traits in US governments of the past. If the author is to be believed, the CIA is holier-than-thou, never intervened anywhere or did anything that wasn't with the best interests of democracy at heart, and American foreign policy has always existed to build friendships and help out their buddies. To someone steeped in this rhetoric for years I am sure this is an easy pill to swallow, but to the rest of the world it makes a number of assertions in the book come across as incredibly naive.

Avid reader , December 20, 2018
Biased much? Still a good start on the problem

My husband and I went to the presentation of this book with Albright at UPenn before it came out, and Madeleine's spunk, wit and just glorious brightness almost blinded me. This is a 2.5-star book, because the 81-year-old author does not really tell you all there is to tell when she opens up on a subject in any particular chapter, especially if it concerns current US interests.

Let's start from the beginning of the book. What really stood out: the missing third Axis ally, Japan and its emperor. Hirohito (1901-1989) was emperor of Japan from 1926 until his death in 1989. He took over at a time of rising democratic sentiment, but his country soon turned toward ultra-nationalism and militarism. During World War II (1939-45), Japan attacked nearly all of its Asian neighbors, allied itself with Nazi Germany and launched a surprise assault on the U.S. naval base at Pearl Harbor, forcing the US to enter the war in 1941. Hirohito was never indicted as a war criminal! Does he deserve at least a chapter in her book?

Oh, and by the way, did the author mention anything about sanctions against Germany for invading Austria, Czechoslovakia, Romania and Poland? Up until Pearl Harbor the USA and Germany still traded, although in March 1939 FDR slapped a 25% tariff on all German goods. Like Trump is doing right now to some of US trading partners.

The next monster that deserves a chapter, for genocide of cosmic proportions post-WW2, is the communist leader of China, Mao Zedong. Mr. Dikötter, who has been studying Chinese rural history from 1958 to 1962, when the nation was facing a famine, found that the systematic torture, brutality, starvation and killing of Chinese peasants compares to the Second World War in its magnitude. At least 45 million people were worked, starved or beaten to death in China over these four years; the total worldwide death toll of the Second World War was 55 million.

We learn that Argentina gave sanctuary to Nazi war criminals, but she forgets to mention that 88 Nazi scientists arrived in the United States in 1945 and were promptly put to work. For example, Wernher von Braun was the brains behind the V-2 rocket program, but had intimate knowledge of what was going on in the concentration camps. Von Braun himself hand-picked people from horrific places, including the Buchenwald concentration camp. Tsk-tsk, Madeleine.

What else? Oh, let's just say that, like Madeleine Albright, my husband is Jewish and lost extensive family to the Holocaust. Ukrainian nationalists executed his great-grandfather on Gestapo orders; his great-grandmother disappeared in a concentration camp; his grandfather was conscripted in June 1940, decommissioned in September 1945, and went through the war as an infantryman on three fronts, earning several medals. His grandmother, a Ukrainian-born Jew, was a doctor in a military hospital in Saint Petersburg who survived famine and saved several children during the blockade. So unlike Madeleine, who was raised as a Roman Catholic, my husband grew up in a quiet Jewish family in the territory that Stalin grabbed from Poland in 1939, in a Polish-turned-Ukrainian city called Lvov (Lemberg). His family also had to ask for asylum, only they had to escape their home in Ukraine in 1991. He was told then, "You are a nice little Zid (Jew), we will kill you last." If you think things in Ukraine have changed, think again: a few weeks ago in Kiev, Roma were killed and injured during pogroms, and despite witnesses nobody went to jail. Also, during demonstrations the C14 unit openly waves swastikas and gives Heil salutes in the streets. Why is this not mentioned anywhere in the book? Is it because Hunter Biden has sat on the board of one of Ukraine's largest natural gas companies, Burisma, since May 14, 2014, and Ukraine has an estimated 127.9 trillion cubic feet of unproved technically recoverable shale gas resources, according to the U.S. Energy Information Administration (EIA)? The most promising shale reserves appear to be in the Carpathian Foreland Basin (also called the Lviv-Volyn Basin), which extends across Western Ukraine from Poland into Romania, and the Dnieper-Donets Basin in the East (which borders Russia).
Wow, I bet you did not know that. How ugly politics are; even this book, which could have been so much greater if the author had told the whole ugly story. And how scary that there are countries where you can go and openly be a fascist.

NJ, February 3, 2019
Interesting...yes. Useful...hmmm

To me, "Fascism" as a category fails for the single reason that no two fascist leaders are alike. Learning about one or a few, whether in a highly cursory fashion as in this book or in great detail, is unlikely to provide any answers on how to prevent the rise of another or to fend one off. And, as much as we are witnessing the rise of numerous democratic or quasi-democratic "strongmen" around the world in global politics, it is difficult to brand any of them as fascist in the orthodox sense.

As the author writes at the outset, it is difficult to separate a fascist from a tyrant or a dictator. A fascist is a majoritarian who rouses a large group under some national, racial or similar flag with rallying cries demanding the suppression or expulsion of those excluded from this group. A typical fascist leader loves her yes-men and hates those who disagree: she does not mind using violence to suppress dissidents. A fascist has no qualms about using propaganda to popularize agreeable "facts" and theories while debunking the inconvenient as lies. What is not discussed explicitly in the book are perhaps some positive traits that separate fascists from other types of tyrants: fascists are rarely lazy, stupid or prone to doing things only for personal gain. They differ from benevolent dictators in their record of using heavy oppression against their dissidents. Fascists, like all dictators, change rules to suit themselves, take control of state organizations to exercise total control, and use "our class is the greatest" and "kick the others" to fuel their programs.

Despite such a detailed list, each fascist differs from the others. There is little that even Ms Albright's fascists - from Mussolini and Hitler to Stalin to the Kims to Chavez or Erdogan - have in common. In fact, most of the opponents of some of these dictators/leaders would call them by many other choice words, but not fascists. The circumstances that gave rise to these leaders were highly different, and so were their rules, methods and achievements.

The point, once again, is that none of the strongman leaders around the world can be easily categorized as fascists. And even if they could, assigning them such a tag and learning about some other such leaders is unlikely to help. The history discussed in the book is interesting but disjointed, perfunctory and simplistic. Ms Albright's selection is also debatable.

Strong leaders who suppress those they deem opponents have wreaked immense harm and are a threat to all civil societies. They come in more shades and colours than we have terms for in our vocabulary (dictators, tyrants, fascists, despots, autocrats, etc.). A study of such tyrants is needed by anyone with an interest in history, politics, or societal well-being. Despite Ms Albright's phenomenal knowledge, experience, credentials, personal history and intentions, this book is perhaps not the best place to objectively learn about the risks from the things some current leaders are doing or deeming right.

Gderf , February 15, 2019
Wrong warning

Each time I get concerned about Trump's rhetoric or past actions, I read idiotic opinions like those of our second-worst-ever Secretary of State and come to appreciate him more. Pejorative terms like fascism or populism have no place in a rational policy discussion; both are blatant attempts to apply a pejorative to any disagreeing opinion. More than half of the book is fluffed with background on Albright, Hitler and Mussolini; Wikipedia is more informative. The rest has snippets of more modern dictators, many of whom are either socialists or attained power through a reaction to failed socialism, as did Hitler. She squirms mightily to liken Trump to Hitler. It's much easier to see that Sanders is like Maduro. The USA is following a path more like Venezuela's than Germany's.

Her history misses that Mussolini was a socialist before he was a fascist, and that Nazism in Germany was a reaction to Weimar socialism. The danger of fascism in the US is far greater from the left than from the right. America is far left of where the USSR ever was. Remember that Marx observed that Russia was not ready for a proletarian revolution; the USA, with ready-made capitalism to reform, fits Marx's pattern much better. Progressives deny that Sanders and Warren are socialists. If not, they are what Lenin called "useful idiots."
Albright says that she is proud of the speech where she called the USA the 'Indispensable Nation.' She should be ashamed. Obama followed in his inaugural address, saying that we are "the indispensable nation, responsible for world security." That turned into a policy of human rights interventions leading to open ended wars (Syria, Yemen), nations in chaos (Libya), and distrust of the USA (Egypt, Russia, Turkey, Tunisia, Israel, NK). Trump now has to make nice with dictators to allay their fears that we are out to replace them.
She admires the good intentions of human rights intervention, ignoring the results. She says Obama's foreign policy had some success, without citing a single instance. He has apologized for Libya, but many more apologies are needed. Like many progressives, she confuses good intentions with performance. Democracy-spreading by well-intentioned humanitarian intervention has resulted in a succession of open-ended wars and anarchy.

The shorter histories of Czechoslovakia, Yugoslavia and Venezuela are much more informative, although more as a warning against socialism than against right-wing fascism. Viktor Orban in Hungary is another reaction to socialism.

Albright ends the book with a forlorn hope that we need a Lincoln or a Mandela, exactly what our two-party dictatorship will not generate, as it yields ever worse candidates for our democracy to vote on, even as our great-society utopia grants ever more power to weak presidents to spend our money and continue wrong-headed foreign policy.

The greatest danger to the USA is not fascism but excessively poor leadership continuing our slow slide to the bottom.

[Apr 02, 2019] Mr Cohen and I Live on Different Planets

Apr 02, 2019 | www.amazon.com

Looks like the reviewer is a typical neocon, reciting typical neocon think tank talking points that reflect the "Full Spectrum Dominance" agenda.

As for Ukraine: yes, of course, Victoria Nuland did not interfere with the events, did not push for deposing Yanukovych to spoil the agreement reached between him and the EU diplomats ("F**k the EU," as this high-level US diplomat eloquently expressed herself), and did not work to appoint the US stooge Yatsenyuk. The transcript of Nuland's phone call actually introduced many Americans to the previously obscure Yatsenyuk.

And the large amount of cash confiscated in the Kiev office of Yulia Tymoshenko's Batkivshchyna party (the main opposition party at the time, run by Yatsenyuk, as Tymoshenko was in jail) was just a hallucination. It had nothing to do with "bombing with dollars" -- yet another typical color revolution trick.

BTW, "government snipers on rooftops" is also a standard false flag operation, used to incite an uprising at the critical moment of a color revolution. Ukraine was not the first and will not be the last; one participant recently confessed. The key person in this false flag operation was the opposition leader Andriy Parubiy, who was responsible for the security of the opposition camp. Google "Parubiy and snipergate" for more information.

His view of the DNC hack (which most probably was a leak) also does not withstand close scrutiny. William Binney, a former high-level National Security Agency official who co-authored an analysis with a group of former intelligence professionals, thinks that this was a transfer to a local USB drive, as the download speed was too high for an Internet connection. In this light the death of Seth Rich looks very suspicious indeed.

As for Russiagate, he now needs to print his review and a portrait of the Grand Wizard of Russiagate, Rachel Maddow, shred both of them, and eat them with borscht ;-)

[Apr 01, 2019] Amazon.com: War with Russia: From Putin & Ukraine to Trump & Russiagate (9781510745810): Stephen F. Cohen: Books

Highly recommended!
Important book. Kindle sample
Notable quotes:
"... Washington has made many policies strongly influenced by the demonizing of Putin -- a personal vilification far exceeding any ever applied to Soviet Russia's latter-day Communist leaders. ..."
"... As with all institutions, the demonization of Putin has its own history. When he first appeared on the world scene as Boris Yeltsin's anointed successor, in 1999-2000, Putin was welcomed by leading representatives of the US political-media establishment. The New York Times' chief Moscow correspondent and other verifiers reported that Russia's new leader had an "emotional commitment to building a strong democracy." Two years later, President George W. Bush lauded his summit with Putin and "the beginning of a very constructive relationship." ..."
"... But the Putin-friendly narrative soon gave way to unrelenting Putin-bashing. In 2004, Times columnist Nicholas Kristof inadvertently explained why, at least partially. Kristof complained bitterly of having been "suckered by Mr. Putin. He is not a sober version of Boris Yeltsin." By 2006, a Wall Street Journal editor, expressing the establishment's revised opinion, declared it "time we start thinking of Vladimir Putin's Russia as an enemy of the United States." 10, 11 The rest, as they say, is history. ..."
"... In America and elsewhere in the West, however, only purported "minuses" reckon in the extreme vilifying, or anti-cult, of Putin. Many are substantially uninformed, based on highly selective or unverified sources, and motivated by political grievances, including those of several Yeltsin-era oligarchs and their agents in the West. ..."
"... Putin is not the man who, after coming to power in 2000, "de-democratized" a Russian democracy established by President Boris Yeltsin in the 1990s and restored a system akin to Soviet "totalitarianism." ..."
"... Nor did Putin then make himself a tsar or Soviet-like autocrat, which means a despot with absolute power to turn his will into policy; the last Kremlin leader with that kind of power was Stalin, who died in 1953, and with him his 20-year mass terror. ..."
"... Putin is not a Kremlin leader who "reveres Stalin" and whose "Russia is a gangster shadow of Stalin's Soviet Union." 13, 14 These assertions are so far-fetched and uninformed about Stalin's terror-ridden regime, Putin, and Russia today that they barely warrant comment. ..."
"... Nor did Putin create post-Soviet Russia's "kleptocratic economic system," with its oligarchic and other widespread corruption. This too took shape under Yeltsin during the Kremlin's shock-therapy "privatization" schemes of the 1990s, when the "swindlers and thieves" still denounced by today's opposition actually emerged. ..."
"... Which brings us to the most sinister allegation against him: Putin, trained as "a KGB thug," regularly orders the killing of inconvenient journalists and personal enemies, like a "mafia state boss." ..."
"... More recently, there is yet another allegation: Putin is a fascist and white supremacist. The accusation is made mostly, it seems, by people wishing to deflect attention from the role being played by neo-Nazis in US-backed Ukraine. ..."
"... Finally, at least for now, there is the ramifying demonization allegation that, as a foreign-policy leader, Putin has been exceedingly "aggressive" abroad and his behavior has been the sole cause of the new cold war. ..."
"... Embedded in the "aggressive Putin" axiom are two others. One is that Putin is a neo-Soviet leader who seeks to restore the Soviet Union at the expense of Russia's neighbors. He is obsessively misquoted as having said, in 2005, "The collapse of the Soviet Union was the greatest geopolitical catastrophe of the twentieth century," apparently ranking it above two World Wars. What he actually said was "a major geopolitical catastrophe of the twentieth century," as it was for most Russians. ..."
"... The other fallacious sub-axiom is that Putin has always been "anti-Western," specifically "anti-American," and has "always viewed the United States" with "smoldering suspicions" -- so much so that eventually he set into motion a "Plot Against America." ..."
"... Or that, until he finally concluded that Russia would never be treated as an equal and that NATO had encroached too close, Putin was a full partner in the US-European clubs of major world leaders? Indeed, as late as May 2018, contrary to Russiagate allegations, he still hoped, as he had from the beginning, to rebuild Russia partly through economic partnerships with the West: "To attract capital from friendly companies and countries, we need good relations with Europe and with the whole world, including the United States." 32 ..."
"... A few years earlier, Putin remarkably admitted that initially he had "illusions" about foreign policy, without specifying which. Perhaps he meant this, spoken at the end of 2017: "Our most serious mistake in relations with the West is that we trusted you too much. And your mistake is that you took that trust as weakness and abused it." 34 ..."
"... P. Philips ..."
"... "In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act" ..."
"... Professor Cohen is indeed a patriot of the highest order. The American and "Globalists" elites, particularly the dysfunctional United Kingdom, are engaging in a war of nerves with Russia. This war, which could turn nuclear for reasons discussed in this important book, is of no benefit to any person or nation. ..."
"... If you are a viewer of one of the legacy media outlets, be it Cable Television networks, with the exception of Tucker Carlson on Fox who has Professor Cohen as a frequent guest, or newspapers such as The New York Times, you have been exposed to falsehoods by remarkably ignorant individuals; ignorant of history, of the true nature of Russia (which defeated the Nazis in Europe at a loss of millions of lives) and most important, of actual military experience. America is neither an invincible nor an exceptional nation. And for those familiar with the terminology of ancient history, it appears the so-called elites are suffering from hubris. ..."
Apr 01, 2019 | www.amazon.com

THE SPECTER OF AN EVIL-DOING VLADIMIR PUTIN HAS loomed over and undermined US thinking about Russia for at least a decade. Inescapably, it is therefore a theme that runs through this book. Henry Kissinger deserves credit for having warned, perhaps alone among prominent American political figures, against this badly distorted image of Russia's leader since 2000: "The demonization of Vladimir Putin is not a policy. It is an alibi for not having one." 4

But Kissinger was also wrong. Washington has made many policies strongly influenced by the demonizing of Putin -- a personal vilification far exceeding any ever applied to Soviet Russia's latter-day Communist leaders. Those policies spread from growing complaints in the early 2000s to US-Russian proxy wars in Georgia, Ukraine, Syria, and eventually even at home, in Russiagate allegations. Indeed, policy-makers adopted an earlier formulation by the late Senator John McCain as an integral part of a new and more dangerous Cold War: "Putin [is] an unreconstructed Russian imperialist and K.G.B. apparatchik.... His world is a brutish, cynical place.... We must prevent the darkness of Mr. Putin's world from befalling more of humanity." 5

Mainstream media outlets have played a major prosecutorial role in the demonization. Far from atypically, the Washington Post's editorial page editor wrote, "Putin likes to make the bodies bounce.... The rule-by-fear is Soviet, but this time there is no ideology -- only a noxious mixture of personal aggrandizement, xenophobia, homophobia and primitive anti-Americanism." 6 Esteemed publications and writers now routinely degrade themselves by competing to denigrate "the flabbily muscled form" of the "small gray ghoul named Vladimir Putin." 7, 8 There are hundreds of such examples, if not more, over many years. Vilifying Russia's leader has become a canon in the orthodox US narrative of the new Cold War.

As with all institutions, the demonization of Putin has its own history. When he first appeared on the world scene as Boris Yeltsin's anointed successor, in 1999-2000, Putin was welcomed by leading representatives of the US political-media establishment. The New York Times' chief Moscow correspondent and other verifiers reported that Russia's new leader had an "emotional commitment to building a strong democracy." Two years later, President George W. Bush lauded his summit with Putin and "the beginning of a very constructive relationship." 9

But the Putin-friendly narrative soon gave way to unrelenting Putin-bashing. In 2004, Times columnist Nicholas Kristof inadvertently explained why, at least partially. Kristof complained bitterly of having been "suckered by Mr. Putin. He is not a sober version of Boris Yeltsin." By 2006, a Wall Street Journal editor, expressing the establishment's revised opinion, declared it "time we start thinking of Vladimir Putin's Russia as an enemy of the United States." 10, 11 The rest, as they say, is history.

Who has Putin really been during his many years in power? We may have to leave this large, complex question to future historians, when materials for full biographical study -- memoirs, archive documents, and others -- are available. Even so, it may surprise readers to know that Russia's own historians, policy intellectuals, and journalists already argue publicly and differ considerably as to the "pluses and minuses" of Putin's leadership. (My own evaluation is somewhere in the middle.)

In America and elsewhere in the West, however, only purported "minuses" reckon in the extreme vilifying, or anti-cult, of Putin. Many are substantially uninformed, based on highly selective or unverified sources, and motivated by political grievances, including those of several Yeltsin-era oligarchs and their agents in the West.

By identifying and examining, however briefly, the primary "minuses" that underpin the demonization of Putin, we can understand at least who he is not:

Embedded in the "aggressive Putin" axiom are two others. One is that Putin is a neo-Soviet leader who seeks to restore the Soviet Union at the expense of Russia's neighbors. He is obsessively misquoted as having said, in 2005, "The collapse of the Soviet Union was the greatest geopolitical catastrophe of the twentieth century," apparently ranking it above two World Wars. What he actually said was "a major geopolitical catastrophe of the twentieth century," as it was for most Russians.

Though often critical of the Soviet system and its two formative leaders, Lenin and Stalin, Putin, like most of his generation, naturally remains in part a Soviet person. But what he said in 2010 reflects his real perspective and that of very many other Russians: "Anyone who does not regret the break-up of the Soviet Union has no heart. Anyone who wants its rebirth in its previous form has no head." 28 , 29

The other fallacious sub-axiom is that Putin has always been "anti-Western," specifically "anti-American," and has "always viewed the United States" with "smoldering suspicions" -- so much so that eventually he set into motion a "Plot Against America." 30, 31 A simple reading of his years in power tells us otherwise. A Westernized Russian, Putin came to the presidency in 2000 in the still prevailing tradition of Gorbachev and Yeltsin -- in hope of a "strategic friendship and partnership" with the United States.

How else to explain Putin's abundant assistance to US forces fighting in Afghanistan after 9/11 and continued facilitation of supplying American and NATO troops there? Or his backing of harsh sanctions against Iran's nuclear ambitions and refusal to sell Tehran a highly effective air-defense system? Or the information his intelligence services shared with Washington that, if heeded, could have prevented the Boston Marathon bombings in April 2013?

Or that, until he finally concluded that Russia would never be treated as an equal and that NATO had encroached too close, Putin was a full partner in the US-European clubs of major world leaders? Indeed, as late as May 2018, contrary to Russiagate allegations, he still hoped, as he had from the beginning, to rebuild Russia partly through economic partnerships with the West: "To attract capital from friendly companies and countries, we need good relations with Europe and with the whole world, including the United States." 32

Given all that has happened during the past nearly two decades -- particularly what Putin and other Russian leaders perceive to have happened -- it would be remarkable if his views of the West, especially America, had not changed. As he remarked in 2018, "We all change." 33

A few years earlier, Putin remarkably admitted that initially he had "illusions" about foreign policy, without specifying which. Perhaps he meant this, spoken at the end of 2017: "Our most serious mistake in relations with the West is that we trusted you too much. And your mistake is that you took that trust as weakness and abused it." 34


P. Philips, December 6, 2018

"In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act"

"In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act" is a well-known quotation (though probably not from George Orwell). And in telling the truth about Russia and that the current "war of nerves" is not in the interests of either the American people or national security, Professor Cohen has in this book in fact performed a revolutionary act.

Like a denizen of Plato's cave, or being in the film the Matrix, most people have no idea what the truth is. And the questions raised by Professor Cohen are a great service in the cause of the truth. As Professor Cohen writes in his introduction To His Readers:

"My scholarly work -- my biography of Nikolai Bukharin and essays collected in Rethinking the Soviet Experience and Soviet Fates and Lost Alternatives, for example -- has always been controversial because it has been what scholars term "revisionist" -- reconsiderations, based on new research and perspectives, of prevailing interpretations of Soviet and post-Soviet Russian history. But the "controversy" surrounding me since 2014, mostly in reaction to the contents of this book, has been different -- inspired by usually vacuous, defamatory assaults on me as "Putin's No. 1 American Apologist," "Best Friend," and the like. I never respond specifically to these slurs because they offer no truly substantive criticism of my arguments, only ad hominem attacks. Instead, I argue, as readers will see in the first section, that I am a patriot of American national security, that the orthodox policies my assailants promote are gravely endangering our security, and that therefore we -- I and others they assail -- are patriotic heretics. Here too readers can judge."

Cohen, Stephen F.. War with Russia (Kindle Locations 131-139). Hot Books. Kindle Edition.

Professor Cohen is indeed a patriot of the highest order. The American and "Globalists" elites, particularly the dysfunctional United Kingdom, are engaging in a war of nerves with Russia. This war, which could turn nuclear for reasons discussed in this important book, is of no benefit to any person or nation.

Indeed, with all the hysteria over "climate change," isn't it odd that, apart from Professor Cohen's, no prominent voices are warning of the devastation that nuclear war would bring?

If you are a viewer of one of the legacy media outlets, be it Cable Television networks, with the exception of Tucker Carlson on Fox who has Professor Cohen as a frequent guest, or newspapers such as The New York Times, you have been exposed to falsehoods by remarkably ignorant individuals; ignorant of history, of the true nature of Russia (which defeated the Nazis in Europe at a loss of millions of lives) and most important, of actual military experience. America is neither an invincible nor an exceptional nation. And for those familiar with the terminology of ancient history, it appears the so-called elites are suffering from hubris.

I cannot recommend Professor Cohen's work with sufficient superlatives; his arguments are erudite, clearly stated, supported by the facts and ultimately irrefutable. If enough people find Professor Cohen's work and raise their voices to their oblivious politicians and profiteers from war to stop further confrontation between Russia and America, then this book has served a noble purpose.

If nothing else, educate yourself by reading this work to discover what the *truth* is. And the truth is something sacred.

America and the world owe Professor Cohen a great debt. "Blessed are the peace makers..."

[Mar 31, 2019] George Nader (an adviser to the crown prince of Abu Dhabi): Nobody would even waste a cup of coffee on him if it wasn't for who he was married to

Notable quotes:
"... She suggests, "Kushner was increasingly caught up in his own mythology. He was the president's son-in-law, so he apparently thought he was untouchable." (Pg. 114) She notes, "allowing Kushner to work in the administration broke with historical precedent, overruling a string of Justice Department memos that concluded it was illegal for presidents to appoint relatives as White House staff." (Pg. 119) ..."
"... She observes, "Those first few days were chaotic for almost everyone in the new administration. A frantic Reince Priebus would quickly discover that it was impossible to impose any kind of order in this White House, in large part because Trump didn't like order. What Trump liked was having people fight in front of him and then he'd make a decision, just like he'd made snap decisions when his children presented licensing deals for the Trump Organization. This kind of dysfunction enabled a 'floater' like Kushner, whose job was undefined, to weigh in on any topic in front of Trump and have far more influence than he would have had in a top-down hierarchy." (Pg. 125) ..."
Mar 31, 2019 | www.amazon.com

Steven H Propp, TOP 50 REVIEWER, 5.0 out of 5 stars, March 27, 2019

AN INFORMATIVE BOOK ABOUT THE PRESIDENT'S DAUGHTER AND SON-IN-LAW

Author Vicky Ward wrote in the Prologue to this 2019 book, "Donald Trump was celebrating being sworn in as president... And the whole world knew that his daughter and son-in-law were his most trusted advisers, ambassadors, and coconspirators. They were an attractive couple---extremely wealthy and, now, extraordinarily powerful. Ivanka looked like Cinderella... Ivanka and her husband swept onto the stage, deftly deflecting attention from Donald Trump's clumsy moves, as she had done so often over the past twenty years. The crowd roared in approval... They were now America's prince and princess."

She notes, "Jared Kushner learned about the company [his father's] he would later run. Jared was the firm's most sheltered trainee. On his summer vacations, he'd go to work at Kushner Companies construction sites, maybe painting a few walls, more often sitting and listening to music... No one dared tell him this probably would not give him a deep understanding of the construction process. But Charlie [Jared's father] doggedly groomed his eldest son for greatness, seeing himself as a Jewish version of Joseph Kennedy..." (Pg. 17-18)

She states, "Ivanka had to fight for her father's attention and her ultimate role as the chief heir in his real estate empire... When Donald Trump divorced her mother, Ivana... she would go out of her way to see more of her father, not less... she'd call him during the day and, to her delight, he'd always take her call. (Trump's relationship with the two sons he had with Ivana, Don Jr. and Eric, was not nearly so close for years.) 'She was always Daddy's little girl,' said a family friend." (Pg. 32-33) She adds, "As Ivanka matured, physically and emotionally, her father talked openly about how impressed he was with her appearance---a habit he has maintained to this day." (Pg. 35)

She recounts, "at a networking lunch thrown by a diamond heir... Jared was introduced to Ivanka... Jared and Ivanka quickly became an intriguing gossip column item. They seemed perfectly matched... But after a year of dating, they split, in part because Jared's parents were dismayed at the idea of their son marrying outside the faith... Soon after, Ivanka agreed to convert to Judaism... Trump was said to be discombobulated by the enormity of what his daughter had done. Trump, a Presbyterian, who strikes no one as particularly religious, was baffled by his daughter's conversion... 'Why should my daughter convert to marry anyone?'" (Pg. 51-53)

She observes, "Ivanka Trump was critical in promoting her husband as the smoother, softer counterpart to his father's volatility... they could both work a room, ask after people's children, talk without notes, occasionally fake a sense of humor... And unlike her husband, she seemed to have a ready command of figures and a detailed, working knowledge of all the properties she was involved in... Ivanka seemed to control the marital relationship, but she also played the part of devoted, traditional Orthodox wife." (Pg. 70-71)

Of 2016, she states, "No one thought Kushner or Ivanka believed in Trump's populist platform. 'The two of them see this as a networking opportunity,' said a close associate. Because Kushner and Ivanka only fully immersed themselves in Trump's campaign once he became the presumptive Republican nominee... they had to push to assert themselves with the campaign staff... Kushner quickly got control of the campaign's budget, but he did not have as much authority as he would have liked." (Pg. 74-75) She adds, "Ivanka appeared thrilled by her husband's rising prominence in her father's campaign. It was a huge change from the days when Trump had made belittling jokes about him. If Don Jr. and Eric were irked by the new favorite in Trump's court, they did not show it publicly." (Pg. 85)

She points out, "Trump tweeted an image [Hillary with a backdrop of money and a Star of David] widely viewed as anti-Semitic ... an 'Observer' writer criticized Kushner in his own newspaper for standing 'silent and smiling in the background' while Trump made 'repeated accidental winks' to white supremacists ... Kushner wrote a response [that] insisted that Trump was neither anti-Semitic nor a racist ... Not all of Kushner's relatives appreciated his efforts to cover Trump's pandering to white supremacists." (Pg. 86-87) Later, she adds, "U.S.-Israel relations was the one political issue anyone in the campaign ever saw Kushner get worked up about." (Pg. 96)

On election night, "Kushner was shocked that Trump never mentioned him in his speech and would later tell people he felt slighted. He was going to find a way to get Trump to notice him more. Ivanka would help him ... the couple would become known as a single, powerful entity: 'Javanka.'" (Pg. 101) She suggests, "Kushner was increasingly caught up in his own mythology. He was the president's son-in-law, so he apparently thought he was untouchable." (Pg. 114) She notes, "allowing Kushner to work in the administration broke with historical precedent, overruling a string of Justice Department memos that concluded it was illegal for presidents to appoint relatives as White House staff." (Pg. 119)

She observes, "Those first few days were chaotic for almost everyone in the new administration. A frantic Reince Priebus would quickly discover that it was impossible to impose any kind of order in this White House, in large part because Trump didn't like order. What Trump liked was having people fight in front of him and then he'd make a decision, just like he'd made snap decisions when his children presented licensing deals for the Trump Organization. This kind of dysfunction enabled a 'floater' like Kushner, whose job was undefined, to weigh in on any topic in front of Trump and have far more influence than he would have had in a top-down hierarchy." (Pg. 125)

She recounts, "Another epic [Steve] Bannon/Ivanka fight came when Bannon was in the Oval Office dining room while Trump was watching TV and eating his lunch ... Ivanka marched in, claiming Bannon had leaked H.R. McMaster's war plan ... [Bannon said] 'No, that was leaked by McMaster.' ... Trump [told her], 'Hey, baby, I think Steve's right on this one.' ... Bannon thought he would be fired on the spot. But he'd learned something important: much as Trump loved his daughter and hated saying no to her, he was not always controlled by her." (Pg. 138-139)

She notes, "[Ivanka] also found a way to be near Trump when he received phone calls from foreign dignitaries -- while she still owned her business. While Ivanka's behavior was irritating, Kushner was playing a game on a whole different level: he was playing for serious money at the time of the Qatari blockade ... Kushner's family had been courting the Qataris for financial help and had been turned down. When that story broke, the blockade and the Trump administration's response to it suddenly all made sense." (Pg. 156)

Arguing that "Kushner was behind the decision to fire [FBI Director James] Comey" (Pg. 163-164), she writes, "Quickly, Trump realized he'd made an error, and blamed Kushner. It seemed clear to Trump's advisers, and not for the first time, that he wished Kushner were not in the White House. He said to Kushner in front of senior staff, 'Just go back to New York, man.'" (Pg. 167) She adds, "[Ivanka's] reluctance to speak frankly to her father was the antithesis of the story she had been pushing in the media ... Ivanka had told Gayle King, 'Where I disagree with my father, he knows it. And I express myself with total candor.'" (Pg. 170)

She states, "at the Group of 20 summit in Germany she briefly took her father's seat when he had to step out ... The gesture seemed to send the message that the U.S. government was now run on nepotism." (Pg. 182)

E-mails from George Nader [an adviser to Sheikh Mohammed bin Zayed Al Nahyan, the crown prince of Abu Dhabi] "made it clear that Kushner's friends in the Gulf mocked him behind his back ... Nader wrote, 'Nobody would even waste a cup of coffee on him if it wasn't for who he was married to.'" (Pg. 206)

She points out, "since October 2017, hundreds of children had been taken from their parents while attempting to cross the U.S.-Mexico border and detained separately ... news shows everywhere showed heartbreaking images of young children being detained. The next month, Ivanka posted on Instagram a photograph of herself holding her youngest child in his pajamas. Not for the first time, her tone-deaf social media post was slammed as being isolated in her elitist, insulated wealthy world ... On June 20, Trump signed an executive order that apparently ended the border separations. Minutes later, Ivanka finally spoke publicly on the issue ... Her tactic here was: tell the public you care about an issue; watch silently while your father does the exact opposite; and when he moves a little, take all the credit." (Pg. 225)

She asserts, "Kushner's friendship with a Saudi crown prince was now under widespread scrutiny [because] Rather than expressing moral outrage over the cold-blooded murder of an innocent man [Saudi journalist Jamal Khashoggi], Kushner did what he always does in a crisis: he went quiet." (Pg. 232)

She concludes, "Ivanka Trump has made no secret of the fact that she wants to be the most powerful woman in the world. Her father's reign in Washington, D.C., is, she believes, the beginning of a great American dynasty ... Ivanka has been carefully positioning herself as [Trump's] political heir ..." (Pg. 236)

While not as "scandalous" as the book's subtitle might suggest, this is a very interesting book that will be of great interest to those wanting information about these crucial members of the Trump family and presidency.

[Mar 28, 2019] Was MAGA a con job?

Notable quotes:
"... Until the Crash of the Great Recession, after which we entered a "Punitive" stage, blaming "Those Others" for buying into faulty housing deals, for wanting a safety net of health care insurance, for resurgent terrorism beyond our borders, and, as the article above indicates, for having an equal citizen's voice in the electoral process. ..."
"... What needs to be restored is the purpose that "the economy works for the PEOPLE of the nation", not the other way around, as we've witnessed for the last four decades. ..."
Feb 26, 2019 | www.amazon.com

Kindle Customer, December 8, 2018

5.0 out of 5 stars How and Why the MAGA-myth Consumed Itself

Just finished reading this excellent book on how corporatist NeoLiberalism and the Xristianists merged their ideologies to form the Conservative Coalition in the 1970s, and to then hijack the RepubliCAN party of Abe, Teddy, Ike (and Poppy Bush).

The author describes three phases of the RepugliCONs' zero-sum game:

The "Combative" stage of Reagan sought to restore "family values" (aka patriarchal hierarchy) to the moral depravity of Sixties youth and the uppity claims to equal rights by blacks and feminists.

In the "Normative" stage of Gingrich and W Bush, the NeoConservatives claimed victory over Godless Communism and the NeoLibs took credit for an expanding economy (due mostly to technology, not to Fed policy). They were happy to say "Aren't you happy now?" with sole ownership of the Free World and its markets, yet ignored various Black Swan events and global trends they actually had no control over.

Until the Crash of the Great Recession, after which we entered a "Punitive" stage, blaming "Those Others" for buying into faulty housing deals, for wanting a safety net of health care insurance, for resurgent terrorism beyond our borders, and, as the article above indicates, for having an equal citizen's voice in the electoral process.

What was unexpected was that the libertarian mutiny by the TeaParty would become so nasty and vicious, leading the Pirate Trump to scavenge what little was left of American Democracy for his own treasure.

What needs to be restored is the purpose that "the economy works for the PEOPLE of the nation", not the other way around, as we've witnessed for the last four decades.

[Feb 17, 2019] Death of the Public University Uncertain Futures for Higher Education in the Knowledge Economy (Higher Education

Notable quotes:
"... Administration bloat and academic decline is another prominent feature of the neoliberal university. University presidents now view themselves as CEO and want similar salaries. ..."
Feb 17, 2019 | www.amazon.com

Customer Review

skeptic 5.0 out of 5 stars February 11, 2019 Format: Kindle Edition

An eye-opening, very important book for any student or educator

This book is a collection of more than a dozen essays by various authors, but even the Introduction (Privatizing the Public University: Key Trends, Countertrends, and Alternatives) is worth the price of the book.

Trends in the neo-liberalization of university education are not new, but recently they have taken a more dangerous turn. And they are not easy to decipher, despite the fact that they greatly affect the life of each student and educator. In this sense this is really an eye-opening book.

In Europe, higher education was previously accessible for free or almost free, but for talented students only. Admission criteria were strict and checked via written and oral entrance exams on key subjects. Now the trend is to view the university as a business that gets customers, charges them exorbitant fees, and at the end hands those customers a diploma for their money, like hamburgers at McDonald's. Whether those degrees are worth the money charged, or are suitable for the particular student (many are "fake" degrees with little or no chance of employment), is not the university's business. On the contrary, marketing is used to attract as many students as possible, and many of those students now remain in debt for a large part of their adult life.

In other words, the neoliberalization of the university in the USA creates a new, now dominant trend -- the conversion of the university into a for-profit diploma mill, which is essentially a new type of rent-seeking (and one that even attracts speculative financial capital and open scamsters, as was the case with "Trump University"). Even old universities with more than a century of history increasingly resemble diploma mills.

This assault on academic freedom by neoliberalism justifies itself by calling for "transparency" and "accountability" to the taxpayer and the public. But it operates using an utter perversion of those terms. In the neoliberal context, they mean "total surveillance" and "rampant rent-seeking."

Neoliberalism has converted education from a public good into a personal investment in the future, a future conceived in terms of earning capacity. Since this is about your future earning potential, it is logical that for a chance to increase it you need to take out a loan.

Significantly, in the same period, per capita spending on prisons increased by 126 percent (Newfield 2008: 266). Between the 1970s and 1990s there was a 400 percent increase in tuition, room, and board charges at U.S. universities, and tuition costs have grown at about ten times the rate of family income (ibid.). What these instances highlight is not just the state's retreat from direct funding of higher education but also a calculated initiative to enable private companies to capture and profit from tax-funded student loans.

The other tendency is also alarming. Funds are now allocated to those institutions that perform best in what has become a fetishistic quest for ever-higher ratings. That creates the "rankings arms-race." It has very little or nothing to do with the quality of teaching in a particular university. On the contrary, curricula have been "streamlined," and "ideologically charged courses" such as neoclassical economics are now required for graduation even in STEM specialties.

In the neoliberal university, professors are now under the iron heel of management, and various metrics have been invented to measure the "quality of teaching." Most of them are perverse, or can be perverted, because when a measurement becomes a target, teachers start to focus their resources and activities primarily on what 'counts' rather than on their wider competencies, professional ethics, and societal goals (see Kohn and Shore, this volume).

Administration bloat and academic decline is another prominent feature of the neoliberal university. University presidents now view themselves as CEOs and want similar salaries. The same is true for the growing staff of university administrators. The recruitment of administrators has far outpaced the growth in the number of faculty -- or even students. Meanwhile, universities claim to be struggling with budget crises that force them to reduce permanent academic posts, and they make wide use of underpaid and overworked adjunct staff -- the 'precariat,' paid just a couple of thousand dollars per course and often existing on the edge of poverty, or in real poverty.

Money is now the key objective, and the mission has changed from a cultural one to a "for profit" business, including vast expenditure on advancing the prestige and competitiveness of the university as an end in itself. The ability to get grants is now an important criterion for getting tenure.

[Jan 14, 2019] Spygate: The Attempted Sabotage of Donald J. Trump

Notable quotes:
"... Elections are just for show like many trials in the old USSR. The in power Party is the power NOT the individual voting citizens. In the end this book is about exposing the pernicious activities of those who would place themselves above the voting citizens of America. ..."
Jan 14, 2019 | www.amazon.com

Johnny G 5.0 out of 5 stars The Complex Made Easy! October 9, 2018 Format: Hardcover Verified Purchase

Regardless of your politics this is a must-read book. The authors do a wonderful job of peeling back the layered onion that is being referred to as "Spy Gate." The book reads like an imaginative spy thriller, except it is as real as a fist in the stomach or the death of your best friend. In this case it is our Constitution that is victimized by individuals entrusted with "protecting and defending it from all enemies DOMESTIC and foreign."

This is in many ways a sad tale of ambition, weak men, political operatives, and hubris-ridden bureaucrats. The end result, IF this type of activity is not punished and roundly condemned by ALL Americans, could be a descent into a Solzhenitsyn GULAG type of Deep State government run by unaccountable political appointees and bureaucrats.

Elections are just for show like many trials in the old USSR. The in power Party is the power NOT the individual voting citizens. In the end this book is about exposing the pernicious activities of those who would place themselves above the voting citizens of America. ALL Americans should be aware of those forces seen and unseen that seek to injure our Constitutional Republic. This book is footnoted extensively lest anyone believes it is a polemic political offering.

JAK 5.0 out of 5 stars The truth hurts and that's the truth October 11, 2018 Format: Hardcover Verified Purchase

This book has content that you will not see or find anywhere else. While the topic itself is covered in large mainstream media outlets, the truth of what is actually happening is rarely ever exposed.

If there were a six-star rating, or anything higher, this book would receive it, because the truth is all that matters.

This book is put together with so many far-left (CNN, BLOOMBERG, DLSTE, YAHOO, ETC.) leading news stories supporting the facts of what happened that it's not possible to say "oh well, that just didn't happen," because it was reported by the left, and when you put all of the pieces of the puzzle together it is painfully obvious to see what happened...

If these people involved don't go to jail, the death of our Republic has already happened.

[Dec 12, 2018] The Neoliberal Agenda and the Student Debt Crisis in U.S. Higher Education (Routledge Studies in Education)

Notable quotes:
"... Neoliberalism's presence in higher education is making matters worse for students and the student debt crisis, not better. ..."
"... Cannan and Shumar (2008) focus their attention on resisting, transforming, and dismantling the neoliberal paradigm in higher education. They ask how can market-based reform serve as the solution to the problem neoliberal practices and policies have engineered? ..."
"... What got us to where we are (escalating tuition costs, declining state monies, and increasing neoliberal influence in higher education) cannot get us out of the $1.4 trillion problem. And yet this metaphor may, in fact, be more apropos than most of us on the right, left, or center are as yet seeing because we mistakenly assume the market we have is the only or best one possible. ..."
"... We only have to realize that the emperor has no clothes and reveal this reality. ..."
"... Indeed, the approach our money-dependent and money-driven legislators and policymakers have employed has been neoliberal in form and function, and it will continue to be so unless we help them to see the light or get out of the way. This book focuses on the $1.4+ trillion student debt crisis in the United States. It doesn't share hard and fast solutions per se. ..."
"... In 2011-2012, 50% of bachelor's degree recipients from for-profit institutions borrowed more than $40,000 and about 28% of associate degree recipients from for-profit institutions borrowed more than $30,000 (College Board, 2015a). ..."
Dec 12, 2018 | www.amazon.com

Despite the fact that neoliberalism brings poor economic growth, inadequate availability of jobs and career opportunities, and the concentration of economic and social rewards in the hands of a privileged upper class, resistance to it, especially at universities, remains weak to non-existent.

The first sign of high levels of dissatisfaction with neoliberalism was the election of Trump (who, of course, betrayed all his election promises, much like Obama before him). As a result, the legitimation of neoliberalism based on references to the efficient and effective functioning of the market (ideological legitimation) is exhausted, while wealth redistribution practices (material legitimation) are

Despite these problems, resistance to neoliberalism remains weak. Strategies and actions of opposition have shifted from the sphere of labor to that of the market, creating a situation in which the idea of the superiority and desirability of the market is shared by dominant and oppositional groups alike. Even emancipatory movements around women, race, ethnicity, and sexual orientation have espoused the individualistic, competition-centered, and meritocratic views typical of neoliberal discourses. Moreover, corporate forces have colonized spaces and discourses that have traditionally been employed by oppositional groups and movements. However, as systemic instability continues and capital accumulation needs to be achieved, change is necessary. Given the weakness of opposition, this change is led by corporate forces that will continue to further their interests but will also attempt to mitigate socio-economic contradictions. The unavailability of ideological mechanisms to legitimize neoliberal arrangements will motivate dominant social actors to make marginal concessions (material legitimation) to subordinate groups. These changes, however, will not alter the corporate co-optation and distortion of discourses that historically defined left-leaning opposition. As contradictions continue, however, their unsustainability will represent a real, albeit difficult, possibility for anti-neoliberal aggregation and substantive change.

Connolly (2016) reported that a poll shows that some graduated student loan borrowers would willingly go to extremes to pay off outstanding student debt. Those extremes include experiencing physical pain and suffering and even a reduced lifespan. For instance, 35% of those polled would take one year off life expectancy and 6.5% would willingly cut off their pinky finger if it meant ridding themselves of the student loan debt they currently held.

Neoliberalism's presence in higher education is making matters worse for students and the student debt crisis, not better. In their book Structure and Agency in the Neoliberal University, Cannan and Shumar (2008) focus their attention on resisting, transforming, and dismantling the neoliberal paradigm in higher education. They ask: how can market-based reform serve as the solution to the problem that neoliberal practices and policies have engineered?

It is like an individual who loses his keys at night and who decides to look only beneath the street light. This may be convenient because there is light, but it might not be where the keys are located. This metaphorical example could relate to the student debt crisis. What got us to where we are (escalating tuition costs, declining state monies, and increasing neoliberal influence in higher education) cannot get us out of the $1.4 trillion problem. And yet this metaphor may, in fact, be more apropos than most of us on the right, left, or center are as yet seeing because we mistakenly assume the market we have is the only or best one possible.

As Lucille (this volume) strives to expose, the systemic cause of our problem is "hidden in plain sight," right there in the street light for all who look carefully enough to see. We only have to realize that the emperor has no clothes and reveal this reality. If and when a critical mass of us do, systemic change in our monetary exchange relations can and, we hope, will become our funnel toward a sustainable and socially, economically, and ecologically just future where public education and democracy can finally become realities rather than merely ideals.

Indeed, the approach our money-dependent and money-driven legislators and policymakers have employed has been neoliberal in form and function, and it will continue to be so unless we help them to see the light or get out of the way. This book focuses on the $1.4+ trillion student debt crisis in the United States. It doesn't share hard and fast solutions per se. Rather, it addresses real questions (and their real consequences). Are collegians overestimating the economic value of going to college?

What are we, they, and our so-called elected leaders failing or refusing to see, and why? This critically minded, soul-searching volume shares territory with, yet pushes beyond, that of Akers and Chingos (2016), Baum (2016), Goldrick-Rab (2016), Graeber (2011), and Johannsen (2016) in ways that we trust those critically minded authors -- and others concerned with our mess of debts, public and private, and unfulfilled human potential -- will find enlightening and even ground-breaking.

... ... ...

In the meantime, college costs have significantly increased over the past fifty years. The average cost of tuition and fees (excluding room and board) for public four-year institutions for a full year has increased from $2,387 (in 2015 dollars) for the 1975-1976 academic year, to $9,410 for 2015-2016. The tuition for public two-year colleges averaged $1,079 in 1975-1976 (in 2015 dollars) and increased to $3,435 for 2015-2016. At private non-profit four-year institutions, the average 1975-1976 cost of tuition and fees (excluding room and board) was $10,088 (in 2015 dollars), which increased to $32,405 for 2015-2016 (College Board, 2015b).

The purchasing power of Pell Grants has decreased. In fact, the maximum Pell Grants coverage of public four-year tuition and fees decreased from 83% in 1995-1996 to 61% in 2015-2016. The maximum Pell Grants coverage of private non-profit four-year tuition and fees decreased from 19% in 1995-1996 to 18% in 2015-2016 (College Board, 2015a).

... ... ....

... In 2013-2014, 61% of bachelor's degree recipients from public and private non-profit four-year institutions graduated with an average debt of $16,300 per graduate. In 2011-2012, 50% of bachelor's degree recipients from for-profit institutions borrowed more than $40,000 and about 28% of associate degree recipients from for-profit institutions borrowed more than $30,000 (College Board, 2015a).

Rising student debt has become a key issue of higher education finance among many policymakers and researchers. Recently, the government has implemented a series of measures to address student debt. In 2005, the Bankruptcy Abuse Prevention and Consumer Protection Act (2005) was passed, which barred the discharge of all student loans through bankruptcy for most borrowers (Collinge, 2009). This was the final nail in the bankruptcy coffin, which had begun in 1976 with a five-year ban on student loan debt (SLD) bankruptcy and was extended to seven years in 1990. Then in 1998, it became a permanent ban for all who could not clear a relatively high bar of undue hardship (Best & Best, 2014).

By 2006, Sallie Mae had become the nation's largest private student loan lender, reporting loan holdings of $123 billion. Its fee income collected from defaulted loans grew from $280 million in 2000 to $920 million in 2005 (Collinge, 2009). In 2007, in response to growing student default rates, the College Cost Reduction Act was passed to provide loan forgiveness for student loan borrowers who work full-time in a public service job: the Federal Direct Loan is forgiven after 120 payments have been made. This Act also provided other benefits to help students pay for their postsecondary education, such as lowering the interest rates of GSLs, increasing the maximum Pell Grant (though, as noted above, not sufficiently to meet rising tuition rates), and reducing guarantor collection fees (Collinge, 2009).

In 2008, the Higher Education Opportunity Act (2008) was passed to increase transparency and accountability. This Act required institutions participating in federal financial aid programs to post a college price calculator on their websites in order to provide better college cost information for students and families (U.S. Department of Education [U.S. DoE], 2015a). Due to the recession of 2008, the American Opportunity Tax Credit of 2009 (AOTC) was passed to expand the Hope Tax Credit program, in which the amount of the tax credit increased to 100% of the first $2,000 of qualified educational expenses plus 25% of the second $2,000 in college expenses. The total credit cap increased from $1,500 to $2,500 per student. As a result, federal spending on education tax benefits has increased sharply since then (Crandall-Hollick, 2014), benefits that, again, are reaped only by those who file income taxes.

[Nov 05, 2018] How neoliberals destroyed University education and then a large part of the US middle class and the US postwar social order by Edward Qualtrough

Notable quotes:
"... Every academic critique of neoliberalism is an unacknowledged memoir. We academics occupy a crucial node in the neoliberal system. Our institutions are foundational to neoliberalism's claim to be a meritocracy, insofar as we are tasked with discerning and certifying the merit that leads to the most powerful and desirable jobs. Yet at the same time, colleges and universities have suffered the fate of all public goods under the neoliberal order. We must therefore "do more with less," cutting costs while meeting ever-greater demands. The academic workforce faces increasing precarity and shrinking wages even as it is called on to teach and assess more students than ever before in human history -- and to demonstrate that we are doing so better than ever, via newly devised regimes of outcome-based assessment. In short, we academics live out the contradictions of neoliberalism every day. ..."
"... Whereas classical liberalism insisted that capitalism had to be allowed free rein within its sphere, under neoliberalism capitalism no longer has a set sphere. We are always "on the clock," always accruing (or squandering) various forms of financial and social capital. ..."
Aug 24, 2016 | www.amazon.com

From: Amazon.com Neoliberalism's Demons On the Political Theology of Late Capital (9781503607125) Adam Kotsko Books

Every academic critique of neoliberalism is an unacknowledged memoir. We academics occupy a crucial node in the neoliberal system. Our institutions are foundational to neoliberalism's claim to be a meritocracy, insofar as we are tasked with discerning and certifying the merit that leads to the most powerful and desirable jobs. Yet at the same time, colleges and universities have suffered the fate of all public goods under the neoliberal order. We must therefore "do more with less," cutting costs while meeting ever-greater demands. The academic workforce faces increasing precarity and shrinking wages even as it is called on to teach and assess more students than ever before in human history -- and to demonstrate that we are doing so better than ever, via newly devised regimes of outcome-based assessment. In short, we academics live out the contradictions of neoliberalism every day.

... ... ...

On a more personal level it reflects my upbringing in the suburbs of Flint, Michigan, a city that has been utterly devastated by the transition to neoliberalism. As I lived through the slow-motion disaster of the gradual withdrawal of the auto industry, I often heard Henry Ford's dictum that a company could make more money if the workers were paid enough to be customers as well, a principle that the major US automakers were inexplicably abandoning. Hence I find it [Fordism -- NNB] to be an elegant way of capturing the postwar model's promise of creating broadly shared prosperity by retooling capitalism to produce a consumer society characterized by a growing middle class -- and of emphasizing the fact that that promise was ultimately broken.

By the mid-1970s, the postwar Fordist order had begun to break down to varying degrees in the major Western countries. While many powerful groups advocated a response to the crisis that would strengthen the welfare state, the agenda that wound up carrying the day was neoliberalism, which was most forcefully implemented in the United Kingdom by Margaret Thatcher and in the United States by Ronald Reagan. And although this transformation was begun by the conservative party, in both countries the left-of-center or (in American usage) "liberal" party wound up embracing neoliberal tenets under Tony Blair and Bill Clinton, ostensibly for the purpose of directing them toward progressive ends.

In the context of current debates within the US Democratic Party, this means that Clinton acolytes are correct to claim that "neoliberalism" just is liberalism, but only to the extent that, in the contemporary United States, the term liberalism is little more than a word for whatever the policy agenda of the Democratic Party happens to be at any given time. Though politicians of all stripes at times used libertarian rhetoric to sell their policies, the most clear-eyed advocates of neoliberalism realized that there could be no simple question of a "return" to the laissez-faire model.

Rather than simply getting the state "out of the way," they both deployed and transformed state power, including the institutions of the welfare state, to reshape society in accordance with market models. In some cases this meant creating markets where none had previously existed, as in the privatization of education and other public services. In others it took the form of a more general spread of a competitive market ethos into ever more areas of life -- so that we are encouraged to think of our reputation as a "brand," for instance, or our social contacts as fodder for "networking." Whereas classical liberalism insisted that capitalism had to be allowed free rein within its sphere, under neoliberalism capitalism no longer has a set sphere. We are always "on the clock," always accruing (or squandering) various forms of financial and social capital.

[Mar 19, 2018] PyCharm - Python IDE Full Review

An increasingly popular installation method: "snap install pycharm-community --classic".
Mar 19, 2018 | www.linuxandubuntu.com

PyCharm is a powerful Integrated Development Environment that can be used to develop Python applications, web apps, and even data analysis tools. PyCharm has everything a Python developer needs. The IDE is full of surprises and keyboard shortcuts that will leave you impressed and at the same time satisfied that your projects are completed on time. Good work from JetBrains. Couldn't have done any better.

[Dec 16, 2017] 3. Data model -- Python 3.6.4rc1 documentation

Notable quotes:
"... __slots__ ..."
"... Note that the current implementation only supports function attributes on user-defined functions. Function attributes on built-in functions may be supported in the future. ..."
"... generator function ..."
"... coroutine function ..."
"... asynchronous generator function ..."
"... operator overloading ..."
"... __init_subclass__ ..."
"... context manager ..."
"... asynchronous iterable ..."
"... asynchronous iterator ..."
"... asynchronous iterator ..."
"... asynchronous context manager ..."
"... context manager ..."
Dec 16, 2017 | docs.python.org

3. Data model

3.1. Objects, values and types

Objects are Python's abstraction for data. All data in a Python program is represented by objects or by relations between objects. (In a sense, and in conformance to Von Neumann's model of a "stored program computer," code is also represented by objects.)

Every object has an identity, a type and a value. An object's identity never changes once it has been created; you may think of it as the object's address in memory. The 'is' operator compares the identity of two objects; the id() function returns an integer representing its identity.

CPython implementation detail: For CPython, id(x) is the memory address where x is stored.

An object's type determines the operations that the object supports (e.g., "does it have a length?") and also defines the possible values for objects of that type. The type() function returns an object's type (which is an object itself). Like its identity, an object's type is also unchangeable. [1]
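The identity/type distinction above can be sketched with a few lines (variable names are illustrative):

```python
a = [1, 2]
b = a                      # two names bound to one object
c = [1, 2]                 # equal value, but a distinct object
assert a is b and a is not c
assert id(a) == id(b)      # id() returns the identity as an integer
assert type(a) is list     # type() returns the object's type (itself an object)
```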

The value of some objects can change. Objects whose value can change are said to be mutable ; objects whose value is unchangeable once they are created are called immutable . (The value of an immutable container object that contains a reference to a mutable object can change when the latter's value is changed; however the container is still considered immutable, because the collection of objects it contains cannot be changed. So, immutability is not strictly the same as having an unchangeable value, it is more subtle.) An object's mutability is determined by its type; for instance, numbers, strings and tuples are immutable, while dictionaries and lists are mutable.
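A minimal sketch of the subtlety described above -- an immutable tuple whose value changes because it holds a mutable list:

```python
t = (1, [2, 3])
t[1].append(4)             # the list inside the tuple mutates...
assert t == (1, [2, 3, 4]) # ...so the tuple's value has changed,
# but the tuple itself is still immutable: rebinding an item fails.
try:
    t[0] = 9
except TypeError:
    pass                   # tuples do not support item assignment
```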

Objects are never explicitly destroyed; however, when they become unreachable they may be garbage-collected. An implementation is allowed to postpone garbage collection or omit it altogether -- it is a matter of implementation quality how garbage collection is implemented, as long as no objects are collected that are still reachable.

CPython implementation detail: CPython currently uses a reference-counting scheme with (optional) delayed detection of cyclically linked garbage, which collects most objects as soon as they become unreachable, but is not guaranteed to collect garbage containing circular references. See the documentation of the gc module for information on controlling the collection of cyclic garbage. Other implementations act differently and CPython may change. Do not depend on immediate finalization of objects when they become unreachable (so you should always close files explicitly).

Note that the use of the implementation's tracing or debugging facilities may keep objects alive that would normally be collectable. Also note that catching an exception with a 'try ... except' statement may keep objects alive.

Some objects contain references to "external" resources such as open files or windows. It is understood that these resources are freed when the object is garbage-collected, but since garbage collection is not guaranteed to happen, such objects also provide an explicit way to release the external resource, usually a close() method. Programs are strongly recommended to explicitly close such objects. The 'try ... finally' statement and the 'with' statement provide convenient ways to do this.
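Both cleanup idioms can be sketched side by side (the temp-file path here is just for the demo):

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "data_model_demo.txt")

with open(path, "w") as f:     # 'with' closes the file on block exit
    f.write("hello")
assert f.closed

f = open(path)
try:
    data = f.read()
finally:
    f.close()                  # explicit close always runs
assert data == "hello" and f.closed
os.remove(path)
```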

Some objects contain references to other objects; these are called containers . Examples of containers are tuples, lists and dictionaries. The references are part of a container's value. In most cases, when we talk about the value of a container, we imply the values, not the identities of the contained objects; however, when we talk about the mutability of a container, only the identities of the immediately contained objects are implied. So, if an immutable container (like a tuple) contains a reference to a mutable object, its value changes if that mutable object is changed.

Types affect almost all aspects of object behavior. Even the importance of object identity is affected in some sense: for immutable types, operations that compute new values may actually return a reference to any existing object with the same type and value, while for mutable objects this is not allowed. E.g., after a = 1; b = 1, a and b may or may not refer to the same object with the value one, depending on the implementation, but after c = []; d = [], c and d are guaranteed to refer to two different, unique, newly created empty lists. (Note that c = d = [] assigns the same object to both c and d.)

3.2. The standard type hierarchy

Below is a list of the types that are built into Python. Extension modules (written in C, Java, or other languages, depending on the implementation) can define additional types. Future versions of Python may add types to the type hierarchy (e.g., rational numbers, efficiently stored arrays of integers, etc.), although such additions will often be provided via the standard library instead.

Some of the type descriptions below contain a paragraph listing 'special attributes.' These are attributes that provide access to the implementation and are not intended for general use. Their definition may change in the future.

None

This type has a single value. There is a single object with this value. This object is accessed through the built-in name None . It is used to signify the absence of a value in many situations, e.g., it is returned from functions that don't explicitly return anything. Its truth value is false.

NotImplemented

This type has a single value. There is a single object with this value. This object is accessed through the built-in name NotImplemented . Numeric methods and rich comparison methods should return this value if they do not implement the operation for the operands provided. (The interpreter will then try the reflected operation, or some other fallback, depending on the operator.) Its truth value is true.

See Implementing the arithmetic operations for more details.

Ellipsis

This type has a single value. There is a single object with this value. This object is accessed through the literal ... or the built-in name Ellipsis . Its truth value is true.

numbers.Number

These are created by numeric literals and returned as results by arithmetic operators and arithmetic built-in functions. Numeric objects are immutable; once created their value never changes. Python numbers are of course strongly related to mathematical numbers, but subject to the limitations of numerical representation in computers.

Python distinguishes between integers, floating point numbers, and complex numbers:

numbers.Integral

These represent elements from the mathematical set of integers (positive and negative).

There are two types of integers:

Integers ( int )

These represent numbers in an unlimited range, subject to available (virtual) memory only. For the purpose of shift and mask operations, a binary representation is assumed, and negative numbers are represented in a variant of 2's complement which gives the illusion of an infinite string of sign bits extending to the left.
Booleans ( bool )

These represent the truth values False and True. The two objects representing the values False and True are the only Boolean objects. The Boolean type is a subtype of the integer type, and Boolean values behave like the values 0 and 1, respectively, in almost all contexts, the exception being that when converted to a string, the strings "False" or "True" are returned, respectively.

The rules for integer representation are intended to give the most meaningful interpretation of shift and mask operations involving negative integers.
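The integer and Boolean rules above can be verified in a few lines:

```python
assert isinstance(True, int)     # bool is a subtype of int
assert True + True == 2          # True behaves like 1 in arithmetic
assert str(True) == "True"       # ...except when converted to a string
assert -1 >> 100 == -1           # the "infinite sign bits" of 2's complement
```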

numbers.Real ( float )

These represent machine-level double precision floating point numbers. You are at the mercy of the underlying machine architecture (and C or Java implementation) for the accepted range and handling of overflow. Python does not support single-precision floating point numbers; the savings in processor and memory usage that are usually the reason for using these are dwarfed by the overhead of using objects in Python, so there is no reason to complicate the language with two kinds of floating point numbers.

numbers.Complex ( complex )

These represent complex numbers as a pair of machine-level double precision floating point numbers. The same caveats apply as for floating point numbers. The real and imaginary parts of a complex number can be retrieved through the read-only attributes z.real and z.imag .

Sequences

These represent finite ordered sets indexed by non-negative numbers. The built-in function len() returns the number of items of a sequence. When the length of a sequence is n, the index set contains the numbers 0, 1, ..., n-1. Item i of sequence a is selected by a[i].

Sequences also support slicing: a[i:j] selects all items with index k such that i <= k < j . When used as an expression, a slice is a sequence of the same type. This implies that the index set is renumbered so that it starts at 0.

Some sequences also support "extended slicing" with a third "step" parameter: a[i:j:k] selects all items of a with index x where x = i + n*k, n >= 0 and i <= x < j.
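A quick sketch of both slicing forms:

```python
a = list(range(10))
assert a[2:5] == [2, 3, 4]         # plain slice: i <= k < j
assert a[1:8:2] == [1, 3, 5, 7]    # extended slice with step 2
s = "hello"
assert s[1:4] == "ell"             # a slice is a sequence of the same type
assert len(a) == 10                # len() counts the items
```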

Sequences are distinguished according to their mutability:

Immutable sequences

An object of an immutable sequence type cannot change once it is created. (If the object contains references to other objects, these other objects may be mutable and may be changed; however, the collection of objects directly referenced by an immutable object cannot change.)

The following types are immutable sequences:

Strings

A string is a sequence of values that represent Unicode code points. All the code points in the range U+0000 - U+10FFFF can be represented in a string. Python doesn't have a char type; instead, every code point in the string is represented as a string object with length 1. The built-in function ord() converts a code point from its string form to an integer in the range 0 - 10FFFF; chr() converts an integer in the range 0 - 10FFFF to the corresponding length-1 string object. str.encode() can be used to convert a str to bytes using the given text encoding, and bytes.decode() can be used to achieve the opposite.
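A short sketch of the ord()/chr() and encode()/decode() round trips:

```python
assert ord("A") == 65
assert chr(65) == "A"
assert chr(0x10FFFF) == "\U0010FFFF"       # highest representable code point
assert len("abc"[0]) == 1                  # no char type: a length-1 string
assert "é".encode("utf-8") == b"\xc3\xa9"  # str -> bytes
assert b"\xc3\xa9".decode("utf-8") == "é"  # bytes -> str
```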

Tuples

The items of a tuple are arbitrary Python objects. Tuples of two or more items are formed by comma-separated lists of expressions. A tuple of one item (a 'singleton') can be formed by affixing a comma to an expression (an expression by itself does not create a tuple, since parentheses must be usable for grouping of expressions). An empty tuple can be formed by an empty pair of parentheses.

Bytes

A bytes object is an immutable array. The items are 8-bit bytes, represented by integers in the range 0 <= x < 256. Bytes literals (like b'abc' ) and the built-in bytes() constructor can be used to create bytes objects. Also, bytes objects can be decoded to strings via the decode() method.

Mutable sequences

Mutable sequences can be changed after they are created. The subscription and slicing notations can be used as the target of assignment and del (delete) statements.

There are currently two intrinsic mutable sequence types:

Lists

The items of a list are arbitrary Python objects. Lists are formed by placing a comma-separated list of expressions in square brackets. (Note that there are no special cases needed to form lists of length 0 or 1.)

Byte Arrays

A bytearray object is a mutable array. They are created by the built-in bytearray() constructor. Aside from being mutable (and hence unhashable), byte arrays otherwise provide the same interface and functionality as immutable bytes objects.

The extension module array provides an additional example of a mutable sequence type, as does the collections module.

Set types

These represent unordered, finite sets of unique, immutable objects. As such, they cannot be indexed by any subscript. However, they can be iterated over, and the built-in function len() returns the number of items in a set. Common uses for sets are fast membership testing, removing duplicates from a sequence, and computing mathematical operations such as intersection, union, difference, and symmetric difference.

For set elements, the same immutability rules apply as for dictionary keys. Note that numeric types obey the normal rules for numeric comparison: if two numbers compare equal (e.g., 1 and 1.0 ), only one of them can be contained in a set.
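The numeric-equality rule for set elements in two lines:

```python
s = {1, 1.0, True}       # 1 == 1.0 == True, so only one element survives
assert len(s) == 1
assert 1.0 in s and True in s
```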

There are currently two intrinsic set types:

Sets

These represent a mutable set. They are created by the built-in set() constructor and can be modified afterwards by several methods, such as add() .

Frozen sets

These represent an immutable set. They are created by the built-in frozenset() constructor. As a frozenset is immutable and hashable , it can be used again as an element of another set, or as a dictionary key.
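Because a frozenset is hashable, it works where a plain set cannot -- as a dict key or as an element of another set:

```python
fs = frozenset({1, 2})
d = {fs: "value"}               # usable as a dictionary key
nested = {fs, frozenset({3})}   # ...or as an element of another set
assert d[frozenset({1, 2})] == "value"
assert len(nested) == 2
```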

Mappings

These represent finite sets of objects indexed by arbitrary index sets. The subscript notation a[k] selects the item indexed by k from the mapping a; this can be used in expressions and as the target of assignments or del statements. The built-in function len() returns the number of items in a mapping.

There is currently a single intrinsic mapping type:

Dictionaries

These represent finite sets of objects indexed by nearly arbitrary values. The only types of values not acceptable as keys are values containing lists or dictionaries or other mutable types that are compared by value rather than by object identity, the reason being that the efficient implementation of dictionaries requires a key's hash value to remain constant. Numeric types used for keys obey the normal rules for numeric comparison: if two numbers compare equal (e.g., 1 and 1.0 ) then they can be used interchangeably to index the same dictionary entry.

Dictionaries are mutable; they can be created by the {...} notation (see section Dictionary displays ).
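Both key rules described above -- interchangeable equal numbers and rejection of mutable keys -- in a short sketch:

```python
d = {}
d[1] = "one"
d[1.0] = "float one"       # same key: 1 == 1.0 and hash(1) == hash(1.0)
assert len(d) == 1 and d[1] == "float one"

try:
    d[[1, 2]] = "nope"     # lists are mutable, hence unhashable
except TypeError:
    pass
```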

The extension modules dbm.ndbm and dbm.gnu provide additional examples of mapping types, as does the collections module.

Callable types

These are the types to which the function call operation (see section Calls ) can be applied:

User-defined functions

A user-defined function object is created by a function definition (see section Function definitions ). It should be called with an argument list containing the same number of items as the function's formal parameter list.

Special attributes:

Attribute Meaning
__doc__ The function's documentation string, or None if unavailable; not inherited by subclasses Writable
__name__ The function's name Writable
__qualname__

The function's qualified name

New in version 3.3.
Writable
__module__ The name of the module the function was defined in, or None if unavailable. Writable
__defaults__ A tuple containing default argument values for those arguments that have defaults, or None if no arguments have a default value Writable
__code__ The code object representing the compiled function body. Writable
__globals__ A reference to the dictionary that holds the function's global variables -- the global namespace of the module in which the function was defined. Read-only
__dict__ The namespace supporting arbitrary function attributes. Writable
__closure__ None or a tuple of cells that contain bindings for the function's free variables. Read-only
__annotations__ A dict containing annotations of parameters. The keys of the dict are the parameter names, and 'return' for the return annotation, if provided. Writable
__kwdefaults__ A dict containing defaults for keyword-only parameters. Writable

Most of the attributes labelled "Writable" check the type of the assigned value.

Function objects also support getting and setting arbitrary attributes, which can be used, for example, to attach metadata to functions. Regular attribute dot-notation is used to get and set such attributes. Note that the current implementation only supports function attributes on user-defined functions. Function attributes on built-in functions may be supported in the future.
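Attaching metadata to a user-defined function via arbitrary attributes (the attribute name here is just an example):

```python
def greet():
    """Say hello."""
    return "hello"

greet.author = "example"            # arbitrary attribute, stored in __dict__
assert greet.author == "example"
assert greet.__name__ == "greet"
assert greet.__doc__ == "Say hello."
```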

Additional information about a function's definition can be retrieved from its code object; see the description of internal types below.

Instance methods

An instance method object combines a class, a class instance and any callable object (normally a user-defined function).

Special read-only attributes: __self__ is the class instance object, __func__ is the function object; __doc__ is the method's documentation (same as __func__.__doc__ ); __name__ is the method name (same as __func__.__name__ ); __module__ is the name of the module the method was defined in, or None if unavailable.

Methods also support accessing (but not setting) the arbitrary function attributes on the underlying function object.

User-defined method objects may be created when getting an attribute of a class (perhaps via an instance of that class), if that attribute is a user-defined function object or a class method object.

When an instance method object is created by retrieving a user-defined function object from a class via one of its instances, its __self__ attribute is the instance, and the method object is said to be bound. The new method's __func__ attribute is the original function object.

When a user-defined method object is created by retrieving another method object from a class or instance, the behaviour is the same as for a function object, except that the __func__ attribute of the new instance is not the original method object but its __func__ attribute.

When an instance method object is created by retrieving a class method object from a class or instance, its __self__ attribute is the class itself, and its __func__ attribute is the function object underlying the class method.

When an instance method object is called, the underlying function ( __func__ ) is called, inserting the class instance ( __self__ ) in front of the argument list. For instance, when C is a class which contains a definition for a function f() , and x is an instance of C , calling x.f(1) is equivalent to calling C.f(x, 1) .

When an instance method object is derived from a class method object, the "class instance" stored in __self__ will actually be the class itself, so that calling either x.f(1) or C.f(1) is equivalent to calling f(C, 1) where f is the underlying function.

Note that the transformation from function object to instance method object happens each time the attribute is retrieved from the instance. In some cases, a fruitful optimization is to assign the attribute to a local variable and call that local variable. Also notice that this transformation only happens for user-defined functions; other callable objects (and all non-callable objects) are retrieved without transformation. It is also important to note that user-defined functions which are attributes of a class instance are not converted to bound methods; this only happens when the function is an attribute of the class.
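The binding behavior above, demonstrated directly:

```python
class C:
    def f(self, n):
        return n + 1

x = C()
assert x.f(1) == C.f(x, 1)              # bound call inserts the instance
m = x.f                                 # a fresh method object per retrieval
assert m.__self__ is x and m.__func__ is C.f
```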

Generator functions

A function or method which uses the yield statement (see section The yield statement ) is called a generator function . Such a function, when called, always returns an iterator object which can be used to execute the body of the function: calling the iterator's iterator.__next__() method will cause the function to execute until it provides a value using the yield statement. When the function executes a return statement or falls off the end, a StopIteration exception is raised and the iterator will have reached the end of the set of values to be returned.
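A minimal generator function showing the behavior just described:

```python
def countdown(n):
    while n > 0:
        yield n          # execution pauses here between __next__() calls
        n -= 1

it = countdown(3)        # calling it returns an iterator, runs no body yet
assert next(it) == 3     # body runs until the first yield
assert list(it) == [2, 1]  # exhausting it raises StopIteration internally
```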

Coroutine functions

A function or method which is defined using async def is called a coroutine function . Such a function, when called, returns a coroutine object. It may contain await expressions, as well as async with and async for statements. See also the Coroutine Objects section.
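A minimal sketch of a coroutine function, driven with asyncio (asyncio.run assumes Python 3.7+; earlier versions need loop.run_until_complete):

```python
import asyncio

async def fetch(value):            # a coroutine function
    await asyncio.sleep(0)         # an await expression
    return value

coro = fetch(42)                   # calling it returns a coroutine object
assert asyncio.iscoroutine(coro)
result = asyncio.run(coro)         # an event loop actually executes it
assert result == 42
```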

Asynchronous generator functions

A function or method which is defined using async def and which uses the yield statement is called an asynchronous generator function . Such a function, when called, returns an asynchronous iterator object which can be used in an async for statement to execute the body of the function.

Calling the asynchronous iterator's aiterator.__anext__() method will return an awaitable which when awaited will execute until it provides a value using the yield expression. When the function executes an empty return statement or falls off the end, a StopAsyncIteration exception is raised and the asynchronous iterator will have reached the end of the set of values to be yielded.

Built-in functions

A built-in function object is a wrapper around a C function. Examples of built-in functions are len() and math.sin() ( math is a standard built-in module). The number and type of the arguments are determined by the C function. Special read-only attributes: __doc__ is the function's documentation string, or None if unavailable; __name__ is the function's name; __self__ is set to None (but see the next item); __module__ is the name of the module the function was defined in or None if unavailable.

Built-in methods

This is really a different disguise of a built-in function, this time containing an object passed to the C function as an implicit extra argument. An example of a built-in method is alist.append() , assuming alist is a list object. In this case, the special read-only attribute __self__ is set to the object denoted by alist .

Classes
Classes are callable. These objects normally act as factories for new instances of themselves, but variations are possible for class types that override __new__() . The arguments of the call are passed to __new__() and, in the typical case, to __init__() to initialize the new instance.
Class Instances
Instances of arbitrary classes can be made callable by defining a __call__() method in their class.
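A callable instance via __call__() (the class name is illustrative):

```python
class Adder:
    def __init__(self, n):
        self.n = n
    def __call__(self, x):     # makes instances of Adder callable
        return x + self.n

add5 = Adder(5)
assert callable(add5)
assert add5(10) == 15          # invokes Adder.__call__(add5, 10)
```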
Modules

Modules are a basic organizational unit of Python code, and are created by the import system as invoked either by the import statement (see import ), or by calling functions such as importlib.import_module() and built-in __import__() . A module object has a namespace implemented by a dictionary object (this is the dictionary referenced by the __globals__ attribute of functions defined in the module). Attribute references are translated to lookups in this dictionary, e.g., m.x is equivalent to m.__dict__["x"] . A module object does not contain the code object used to initialize the module (since it isn't needed once the initialization is done).

Attribute assignment updates the module's namespace dictionary, e.g., m.x = 1 is equivalent to m.__dict__["x"] = 1 .

Predefined (writable) attributes: __name__ is the module's name; __doc__ is the module's documentation string, or None if unavailable; __annotations__ (optional) is a dictionary containing variable annotations collected during module body execution; __file__ is the pathname of the file from which the module was loaded, if it was loaded from a file. The __file__ attribute may be missing for certain types of modules, such as C modules that are statically linked into the interpreter; for extension modules loaded dynamically from a shared library, it is the pathname of the shared library file.

Special read-only attribute: __dict__ is the module's namespace as a dictionary object.

CPython implementation detail: Because of the way CPython clears module dictionaries, the module dictionary will be cleared when the module falls out of scope even if the dictionary still has live references. To avoid this, copy the dictionary or keep the module around while using its dictionary directly.
Custom classes

Custom class types are typically created by class definitions (see section Class definitions ). A class has a namespace implemented by a dictionary object. Class attribute references are translated to lookups in this dictionary, e.g., C.x is translated to C.__dict__["x"] (although there are a number of hooks which allow for other means of locating attributes). When the attribute name is not found there, the attribute search continues in the base classes. This search of the base classes uses the C3 method resolution order which behaves correctly even in the presence of 'diamond' inheritance structures where there are multiple inheritance paths leading back to a common ancestor. Additional details on the C3 MRO used by Python can be found in the documentation accompanying the 2.3 release at https://www.python.org/download/releases/2.3/mro/ .
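The C3 method resolution order on a classic diamond, which a careful reader can verify via __mro__:

```python
class A: pass
class B(A): pass
class C(A): pass
class D(B, C): pass              # 'diamond' inheritance back to A

# C3 linearization: D first, then its bases in order, shared ancestor last
assert D.__mro__ == (D, B, C, A, object)
```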

When a class attribute reference (for class C , say) would yield a class method object, it is transformed into an instance method object whose __self__ attribute is C . When it would yield a static method object, it is transformed into the object wrapped by the static method object. See section Implementing Descriptors for another way in which attributes retrieved from a class may differ from those actually contained in its __dict__ .

Class attribute assignments update the class's dictionary, never the dictionary of a base class.

A class object can be called (see above) to yield a class instance (see below).

Special attributes: __name__ is the class name; __module__ is the module name in which the class was defined; __dict__ is the dictionary containing the class's namespace; __bases__ is a tuple containing the base classes, in the order of their occurrence in the base class list; __doc__ is the class's documentation string, or None if undefined; __annotations__ (optional) is a dictionary containing variable annotations collected during class body execution.

Class instances

A class instance is created by calling a class object (see above). A class instance has a namespace implemented as a dictionary which is the first place in which attribute references are searched. When an attribute is not found there, and the instance's class has an attribute by that name, the search continues with the class attributes. If a class attribute is found that is a user-defined function object, it is transformed into an instance method object whose __self__ attribute is the instance. Static method and class method objects are also transformed; see above under "Classes". See section Implementing Descriptors for another way in which attributes of a class retrieved via its instances may differ from the objects actually stored in the class's __dict__ . If no class attribute is found, and the object's class has a __getattr__() method, that is called to satisfy the lookup.

Attribute assignments and deletions update the instance's dictionary, never a class's dictionary. If the class has a __setattr__() or __delattr__() method, this is called instead of updating the instance dictionary directly.
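The lookup order and the __getattr__() fallback sketched minimally (class and attribute names are illustrative):

```python
class Demo:
    def __getattr__(self, name):       # only called when normal lookup fails
        return "<missing %s>" % name

obj = Demo()
obj.x = 1                              # assignment goes to the instance dict
assert obj.__dict__ == {"x": 1}
assert obj.x == 1                      # found in the instance dictionary
assert obj.y == "<missing y>"          # not found anywhere -> __getattr__
```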

Class instances can pretend to be numbers, sequences, or mappings if they have methods with certain special names. See section Special method names .

Special attributes: __dict__ is the attribute dictionary; __class__ is the instance's class.

I/O objects (also known as file objects)

A file object represents an open file. Various shortcuts are available to create file objects: the open() built-in function, and also os.popen() , os.fdopen() , and the makefile() method of socket objects (and perhaps by other functions or methods provided by extension modules).

The objects sys.stdin , sys.stdout and sys.stderr are initialized to file objects corresponding to the interpreter's standard input, output and error streams; they are all open in text mode and therefore follow the interface defined by the io.TextIOBase abstract class.

Internal types

A few types used internally by the interpreter are exposed to the user. Their definitions may change with future versions of the interpreter, but they are mentioned here for completeness.

Code objects

Code objects represent byte-compiled executable Python code, or bytecode . The difference between a code object and a function object is that the function object contains an explicit reference to the function's globals (the module in which it was defined), while a code object contains no context; also the default argument values are stored in the function object, not in the code object (because they represent values calculated at run-time). Unlike function objects, code objects are immutable and contain no references (directly or indirectly) to mutable objects.

Special read-only attributes: co_name gives the function name; co_argcount is the number of positional arguments (including arguments with default values); co_nlocals is the number of local variables used by the function (including arguments); co_varnames is a tuple containing the names of the local variables (starting with the argument names); co_cellvars is a tuple containing the names of local variables that are referenced by nested functions; co_freevars is a tuple containing the names of free variables; co_code is a string representing the sequence of bytecode instructions; co_consts is a tuple containing the literals used by the bytecode; co_names is a tuple containing the names used by the bytecode; co_filename is the filename from which the code was compiled; co_firstlineno is the first line number of the function; co_lnotab is a string encoding the mapping from bytecode offsets to line numbers (for details see the source code of the interpreter); co_stacksize is the required stack size (including local variables); co_flags is an integer encoding a number of flags for the interpreter.

The following flag bits are defined for co_flags : bit 0x04 is set if the function uses the *arguments syntax to accept an arbitrary number of positional arguments; bit 0x08 is set if the function uses the **keywords syntax to accept arbitrary keyword arguments; bit 0x20 is set if the function is a generator.

Future feature declarations ( from __future__ import division ) also use bits in co_flags to indicate whether a code object was compiled with a particular feature enabled: bit 0x2000 is set if the function was compiled with future division enabled; bits 0x10 and 0x1000 were used in earlier versions of Python.

Other bits in co_flags are reserved for internal use.

If a code object represents a function, the first item in co_consts is the documentation string of the function, or None if undefined.
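A few of the code-object attributes and co_flags bits described above, inspected directly:

```python
def gen():
    """A generator."""
    yield 1

assert gen.__code__.co_flags & 0x20        # bit 0x20: generator function
assert gen.__code__.co_name == "gen"
assert gen.__code__.co_consts[0] == "A generator."  # first const: docstring

def f(a, b=2):
    pass

assert f.__code__.co_argcount == 2         # counts defaulted args too
```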

Frame objects

Frame objects represent execution frames. They may occur in traceback objects (see below).

Special read-only attributes: f_back points to the previous stack frame (towards the caller), or None if this is the bottom stack frame; f_code is the code object being executed in this frame; f_locals is the dictionary used to look up local variables; f_globals is used for global variables; f_builtins is used for built-in (intrinsic) names; f_lasti gives the precise instruction (this is an index into the bytecode string of the code object).

Special writable attributes: f_trace , if not None , is a function called at the start of each source code line (this is used by the debugger); f_lineno is the current line number of the frame -- writing to this from within a trace function jumps to the given line (only for the bottom-most frame). A debugger can implement a Jump command (aka Set Next Statement) by writing to f_lineno.

Frame objects support one method:

frame.clear()
This method clears all references to local variables held by the frame. Also, if the frame belonged to a generator, the generator is finalized. This helps break reference cycles involving frame objects (for example when catching an exception and storing its traceback for later use).

RuntimeError is raised if the frame is currently executing.

New in version 3.4.
Traceback objects

Traceback objects represent a stack trace of an exception. A traceback object is created when an exception occurs. When the search for an exception handler unwinds the execution stack, at each unwound level a traceback object is inserted in front of the current traceback. When an exception handler is entered, the stack trace is made available to the program. (See section The try statement .) It is accessible as the third item of the tuple returned by sys.exc_info() . When the program contains no suitable handler, the stack trace is written (nicely formatted) to the standard error stream; if the interpreter is interactive, it is also made available to the user as sys.last_traceback .

Special read-only attributes: tb_next is the next level in the stack trace (towards the frame where the exception occurred), or None if there is no next level; tb_frame points to the execution frame of the current level; tb_lineno gives the line number where the exception occurred; tb_lasti indicates the precise instruction. The line number and last instruction in the traceback may differ from the line number of its frame object if the exception occurred in a try statement with no matching except clause or with a finally clause.
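A small sketch of walking a traceback obtained from sys.exc_info() (fail is an invented example function):

```python
import sys

def fail():
    raise ValueError("boom")

try:
    fail()
except ValueError:
    tb = sys.exc_info()[2]          # the traceback object
    while tb.tb_next is not None:   # walk towards the frame that raised
        tb = tb.tb_next
    raising = tb.tb_frame.f_code.co_name
    line = tb.tb_lineno             # line where the exception occurred

print(raising)   # fail
```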

Slice objects

Slice objects are used to represent slices for __getitem__() methods. They are also created by the built-in slice() function.

Special read-only attributes: start is the lower bound; stop is the upper bound; step is the step value; each is None if omitted. These attributes can have any type.

Slice objects support one method:

slice.indices(self, length)
This method takes a single integer argument length and computes information about the slice that the slice object would describe if applied to a sequence of length items. It returns a tuple of three integers; respectively these are the start and stop indices and the step or stride length of the slice. Missing or out-of-bounds indices are handled in a manner consistent with regular slices.
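For example, applying a few slice objects to a hypothetical sequence of length 10:

```python
s = slice(None, None, 2)
print(s.start, s.stop, s.step)        # None None 2
print(s.indices(10))                   # (0, 10, 2)
print(slice(-3, None).indices(10))     # (7, 10, 1): negative start resolved
print(list(range(*s.indices(10))))     # [0, 2, 4, 6, 8]
```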
Static method objects
Static method objects provide a way of defeating the transformation of function objects to method objects described above. A static method object is a wrapper around any other object, usually a user-defined method object. When a static method object is retrieved from a class or a class instance, the object actually returned is the wrapped object, which is not subject to any further transformation. Static method objects are not themselves callable, although the objects they wrap usually are. Static method objects are created by the built-in staticmethod() constructor.
Class method objects
A class method object, like a static method object, is a wrapper around another object that alters the way in which that object is retrieved from classes and class instances. The behaviour of class method objects upon such retrieval is described above, under "User-defined methods". Class method objects are created by the built-in classmethod() constructor.
3.3. Special method names

A class can implement certain operations that are invoked by special syntax (such as arithmetic operations or subscripting and slicing) by defining methods with special names. This is Python's approach to operator overloading , allowing classes to define their own behavior with respect to language operators. For instance, if a class defines a method named __getitem__() , and x is an instance of this class, then x[i] is roughly equivalent to type(x).__getitem__(x, i) . Except where mentioned, attempts to execute an operation raise an exception when no appropriate method is defined (typically AttributeError or TypeError ).

Setting a special method to None indicates that the corresponding operation is not available. For example, if a class sets __iter__() to None , the class is not iterable, so calling iter() on its instances will raise a TypeError (without falling back to __getitem__() ). [2]

When implementing a class that emulates any built-in type, it is important that the emulation only be implemented to the degree that it makes sense for the object being modelled. For example, some sequences may work well with retrieval of individual elements, but extracting a slice may not make sense. (One example of this is the NodeList interface in the W3C's Document Object Model.)

3.3.1. Basic customization
object.__new__(cls[, ...])

Called to create a new instance of class cls . __new__() is a static method (special-cased so you need not declare it as such) that takes the class of which an instance was requested as its first argument. The remaining arguments are those passed to the object constructor expression (the call to the class). The return value of __new__() should be the new object instance (usually an instance of cls ).

Typical implementations create a new instance of the class by invoking the superclass's __new__() method using super().__new__(cls[, ...]) with appropriate arguments and then modifying the newly-created instance as necessary before returning it.

If __new__() returns an instance of cls , then the new instance's __init__() method will be invoked like __init__(self[, ...]) , where self is the new instance and the remaining arguments are the same as were passed to __new__() .

If __new__() does not return an instance of cls , then the new instance's __init__() method will not be invoked.

__new__() is intended mainly to allow subclasses of immutable types (like int, str, or tuple) to customize instance creation. It is also commonly overridden in custom metaclasses in order to customize class creation.
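A minimal sketch of the immutable-subclass use case (UpperStr is an invented example class):

```python
class UpperStr(str):
    """str subclass that upper-cases its value at creation time."""
    def __new__(cls, value):
        # str is immutable, so the value must be fixed in __new__();
        # by the time __init__() runs, the instance already exists.
        return super().__new__(cls, value.upper())

s = UpperStr("hello")
print(s)                    # HELLO
print(isinstance(s, str))   # True
```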

object.__init__(self[, ...])

Called after the instance has been created (by __new__() ), but before it is returned to the caller. The arguments are those passed to the class constructor expression. If a base class has an __init__() method, the derived class's __init__() method, if any, must explicitly call it to ensure proper initialization of the base class part of the instance; for example: super().__init__([args...]) .

Because __new__() and __init__() work together in constructing objects ( __new__() to create it, and __init__() to customize it), no non- None value may be returned by __init__() ; doing so will cause a TypeError to be raised at runtime.

object.__del__(self)

Called when the instance is about to be destroyed. This is also called a destructor. If a base class has a __del__() method, the derived class's __del__() method, if any, must explicitly call it to ensure proper deletion of the base class part of the instance. Note that it is possible (though not recommended!) for the __del__() method to postpone destruction of the instance by creating a new reference to it. It may then be called at a later time when this new reference is deleted. It is not guaranteed that __del__() methods are called for objects that still exist when the interpreter exits.

Note

del x doesn't directly call x.__del__() -- the former decrements the reference count for x by one, and the latter is only called when x 's reference count reaches zero. Some common situations that may prevent the reference count of an object from going to zero include: circular references between objects (e.g., a doubly-linked list or a tree data structure with parent and child pointers); a reference to the object on the stack frame of a function that caught an exception (the traceback stored in sys.exc_info()[2] keeps the stack frame alive); or a reference to the object on the stack frame that raised an unhandled exception in interactive mode (the traceback stored in sys.last_traceback keeps the stack frame alive). The first situation can only be remedied by explicitly breaking the cycles; the second can be resolved by freeing the reference to the traceback object when it is no longer useful, and the third can be resolved by storing None in sys.last_traceback . Circular references which are garbage are detected and cleaned up when the cyclic garbage collector is enabled (it's on by default). Refer to the documentation for the gc module for more information about this topic.

Warning

Due to the precarious circumstances under which __del__() methods are invoked, exceptions that occur during their execution are ignored, and a warning is printed to sys.stderr instead. Also, when __del__() is invoked in response to a module being deleted (e.g., when execution of the program is done), other globals referenced by the __del__() method may already have been deleted or in the process of being torn down (e.g. the import machinery shutting down). For this reason, __del__() methods should do the absolute minimum needed to maintain external invariants. Starting with version 1.5, Python guarantees that globals whose name begins with a single underscore are deleted from their module before other globals are deleted; if no other references to such globals exist, this may help in assuring that imported modules are still available at the time when the __del__() method is called.

object.__repr__(self)
Called by the repr() built-in function to compute the "official" string representation of an object. If at all possible, this should look like a valid Python expression that could be used to recreate an object with the same value (given an appropriate environment). If this is not possible, a string of the form <...some useful description...> should be returned. The return value must be a string object. If a class defines __repr__() but not __str__() , then __repr__() is also used when an "informal" string representation of instances of that class is required.

This is typically used for debugging, so it is important that the representation is information-rich and unambiguous.

object.__str__(self)
Called by str(object) and the built-in functions format() and print() to compute the "informal" or nicely printable string representation of an object. The return value must be a string object.

This method differs from object.__repr__() in that there is no expectation that __str__() return a valid Python expression: a more convenient or concise representation can be used.

The default implementation defined by the built-in type object calls object.__repr__() .
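A common pattern, sketched with an invented Point class: __repr__() aims at reconstruction, __str__() at readability:

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __repr__(self):
        # unambiguous and, ideally, eval()-able
        return f"Point({self.x!r}, {self.y!r})"
    def __str__(self):
        return f"({self.x}, {self.y})"

p = Point(1, 2)
print(repr(p))   # Point(1, 2)
print(str(p))    # (1, 2)
print(f"{p}")    # f-strings use __str__: (1, 2)
```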

object.__bytes__(self)

Called by bytes to compute a byte-string representation of an object. This should return a bytes object.

object.__format__(self, format_spec)
Called by the format() built-in function, and by extension, evaluation of formatted string literals and the str.format() method, to produce a "formatted" string representation of an object. The format_spec argument is a string that contains a description of the formatting options desired. The interpretation of the format_spec argument is up to the type implementing __format__() , however most classes will either delegate formatting to one of the built-in types, or use a similar formatting option syntax.

See Format Specification Mini-Language for a description of the standard formatting syntax.

The return value must be a string object.

Changed in version 3.4: The __format__ method of object itself raises a TypeError if passed any non-empty string.
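A sketch of the delegation pattern described above; the Temperature class and its "F" format spec are invented for illustration:

```python
class Temperature:
    def __init__(self, celsius):
        self.celsius = celsius
    def __format__(self, format_spec):
        if format_spec == "F":     # invented custom spec: Fahrenheit
            return f"{self.celsius * 9 / 5 + 32:.1f}F"
        # delegate everything else to the built-in float formatting
        return format(self.celsius, format_spec)

t = Temperature(21.5)
print(f"{t:F}")     # 70.7F
print(f"{t:.2f}")   # 21.50
```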
object.__lt__(self, other)
object.__le__(self, other)
object.__eq__(self, other)
object.__ne__(self, other)
object.__gt__(self, other)
object.__ge__(self, other)

These are the so-called "rich comparison" methods. The correspondence between operator symbols and method names is as follows: x<y calls x.__lt__(y) , x<=y calls x.__le__(y) , x==y calls x.__eq__(y) , x!=y calls x.__ne__(y) , x>y calls x.__gt__(y) , and x>=y calls x.__ge__(y) .

A rich comparison method may return the singleton NotImplemented if it does not implement the operation for a given pair of arguments. By convention, False and True are returned for a successful comparison. However, these methods can return any value, so if the comparison operator is used in a Boolean context (e.g., in the condition of an if statement), Python will call bool() on the value to determine if the result is true or false.

By default, __ne__() delegates to __eq__() and inverts the result unless it is NotImplemented . There are no other implied relationships among the comparison operators, for example, the truth of (x<y or x==y) does not imply x<=y . To automatically generate ordering operations from a single root operation, see functools.total_ordering() .

See the paragraph on __hash__() for some important notes on creating hashable objects which support custom comparison operations and are usable as dictionary keys.

There are no swapped-argument versions of these methods (to be used when the left argument does not support the operation but the right argument does); rather, __lt__() and __gt__() are each other's reflection, __le__() and __ge__() are each other's reflection, and __eq__() and __ne__() are their own reflection. If the operands are of different types, and the right operand's type is a direct or indirect subclass of the left operand's type, the reflected method of the right operand has priority, otherwise the left operand's method has priority. Virtual subclassing is not considered.
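A sketch combining the NotImplemented convention with functools.total_ordering() (Version is an invented class):

```python
import functools

@functools.total_ordering
class Version:
    def __init__(self, major, minor):
        self.key = (major, minor)
    def __eq__(self, other):
        if not isinstance(other, Version):
            return NotImplemented    # let the other operand try
        return self.key == other.key
    def __lt__(self, other):
        if not isinstance(other, Version):
            return NotImplemented
        return self.key < other.key

print(Version(1, 2) < Version(1, 10))   # True
print(Version(1, 2) >= Version(1, 0))   # True, generated by total_ordering
```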

object.__hash__(self)

Called by built-in function hash() and for operations on members of hashed collections including set , frozenset , and dict . __hash__() should return an integer. The only required property is that objects which compare equal have the same hash value; it is advised to mix together the hash values of the components of the object that also play a part in comparison of objects by packing them into a tuple and hashing the tuple. Example:

def __hash__(self):
    return hash((self.name, self.nick, self.color))

Note

hash() truncates the value returned from an object's custom __hash__() method to the size of a Py_ssize_t . This is typically 8 bytes on 64-bit builds and 4 bytes on 32-bit builds. If an object's __hash__() must interoperate on builds of different bit sizes, be sure to check the width on all supported builds. An easy way to do this is with python -c "import sys; print(sys.hash_info.width)" .

If a class does not define an __eq__() method it should not define a __hash__() operation either; if it defines __eq__() but not __hash__() , its instances will not be usable as items in hashable collections. If a class defines mutable objects and implements an __eq__() method, it should not implement __hash__() , since the implementation of hashable collections requires that a key's hash value is immutable (if the object's hash value changes, it will be in the wrong hash bucket).

User-defined classes have __eq__() and __hash__() methods by default; with them, all objects compare unequal (except with themselves) and x.__hash__() returns an appropriate value such that x == y implies both that x is y and hash(x) == hash(y) .

A class that overrides __eq__() and does not define __hash__() will have its __hash__() implicitly set to None . When the __hash__() method of a class is None , instances of the class will raise an appropriate TypeError when a program attempts to retrieve their hash value, and will also be correctly identified as unhashable when checking isinstance(obj, collections.abc.Hashable) .

If a class that overrides __eq__() needs to retain the implementation of __hash__() from a parent class, the interpreter must be told this explicitly by setting __hash__ = <ParentClass>.__hash__ .

If a class that does not override __eq__() wishes to suppress hash support, it should include __hash__ = None in the class definition. A class which defines its own __hash__() that explicitly raises a TypeError would be incorrectly identified as hashable by an isinstance(obj, collections.abc.Hashable) call.

Note

By default, the __hash__() values of str, bytes and datetime objects are "salted" with an unpredictable random value. Although they remain constant within an individual Python process, they are not predictable between repeated invocations of Python.

This is intended to provide protection against a denial-of-service caused by carefully-chosen inputs that exploit the worst case performance of a dict insertion, O(n^2) complexity. See http://www.ocert.org/advisories/ocert-2011-003.html for details.

Changing hash values affects the iteration order of dicts, sets and other mappings. Python has never made guarantees about this ordering (and it typically varies between 32-bit and 64-bit builds).

See also PYTHONHASHSEED . Changed in version 3.3: Hash randomization is enabled by default.

object.__bool__(self)

Called to implement truth value testing and the built-in operation bool() ; should return False or True . When this method is not defined, __len__() is called, if it is defined, and the object is considered true if its result is nonzero. If a class defines neither __len__() nor __bool__() , all its instances are considered true.
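The fallback chain can be seen with two small invented classes:

```python
class Inventory:
    def __init__(self, items):
        self.items = list(items)
    def __len__(self):
        return len(self.items)   # used by bool() when __bool__ is absent

class Flag:
    def __init__(self, value):
        self.value = value
    def __bool__(self):
        return bool(self.value)

print(bool(Inventory([])))    # False, via __len__()
print(bool(Inventory([1])))   # True
print(bool(Flag(0)))          # False, via __bool__()
```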

3.3.2. Customizing attribute access

The following methods can be defined to customize the meaning of attribute access (use of, assignment to, or deletion of x.name ) for class instances.

object.__getattr__(self, name)
Called when an attribute lookup has not found the attribute in the usual places (i.e. it is not an instance attribute nor is it found in the class tree for self ). name is the attribute name. This method should return the (computed) attribute value or raise an AttributeError exception.

Note that if the attribute is found through the normal mechanism, __getattr__() is not called. (This is an intentional asymmetry between __getattr__() and __setattr__() .) This is done both for efficiency reasons and because otherwise __getattr__() would have no way to access other attributes of the instance. Note that at least for instance variables, you can fake total control by not inserting any values in the instance attribute dictionary (but instead inserting them in another object). See the __getattribute__() method below for a way to actually get total control over attribute access.
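A sketch of the "called only on failure" behavior; the CaseInsensitive class is invented for illustration:

```python
class CaseInsensitive:
    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key.lower(), value)
    def __getattr__(self, name):
        # Reached only when normal lookup fails; retry in lower case.
        lowered = name.lower()
        if lowered != name:
            return getattr(self, lowered)
        raise AttributeError(name)

obj = CaseInsensitive(Host="example.org")
print(obj.host)   # normal lookup; __getattr__ is never called
print(obj.HOST)   # normal lookup fails, __getattr__ redirects
```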

object.__getattribute__(self, name)
Called unconditionally to implement attribute accesses for instances of the class. If the class also defines __getattr__() , the latter will not be called unless __getattribute__() either calls it explicitly or raises an AttributeError . This method should return the (computed) attribute value or raise an AttributeError exception. In order to avoid infinite recursion in this method, its implementation should always call the base class method with the same name to access any attributes it needs, for example, object.__getattribute__(self, name) .

Note

This method may still be bypassed when looking up special methods as the result of implicit invocation via language syntax or built-in functions. See Special method lookup .

object.__setattr__(self, name, value)
Called when an attribute assignment is attempted. This is called instead of the normal mechanism (i.e. store the value in the instance dictionary). name is the attribute name, value is the value to be assigned to it.

If __setattr__() wants to assign to an instance attribute, it should call the base class method with the same name, for example, object.__setattr__(self, name, value) .

object.__delattr__(self, name)
Like __setattr__() but for attribute deletion instead of assignment. This should only be implemented if del obj.name is meaningful for the object.
object.__dir__(self)
Called when dir() is called on the object. A sequence must be returned. dir() converts the returned sequence to a list and sorts it.
3.3.2.1. Implementing Descriptors

The following methods only apply when an instance of the class containing the method (a so-called descriptor class) appears in an owner class (the descriptor must be in either the owner's class dictionary or in the class dictionary for one of its parents). In the examples below, "the attribute" refers to the attribute whose name is the key of the property in the owner class' __dict__ .

object.__get__(self, instance, owner)
Called to get the attribute of the owner class (class attribute access) or of an instance of that class (instance attribute access). owner is always the owner class, while instance is the instance that the attribute was accessed through, or None when the attribute is accessed through the owner . This method should return the (computed) attribute value or raise an AttributeError exception.
object.__set__(self, instance, value)
Called to set the attribute on an instance instance of the owner class to a new value, value .
object.__delete__(self, instance)
Called to delete the attribute on an instance instance of the owner class.
object.__set_name__(self, owner, name)
Called at the time the owning class owner is created. The descriptor has been assigned to name . New in version 3.6.

The attribute __objclass__ is interpreted by the inspect module as specifying the class where this object was defined (setting this appropriately can assist in runtime introspection of dynamic class attributes). For callables, it may indicate that an instance of the given type (or a subclass) is expected or required as the first positional argument (for example, CPython sets this attribute for unbound methods that are implemented in C).

3.3.2.2. Invoking Descriptors

In general, a descriptor is an object attribute with "binding behavior", one whose attribute access has been overridden by methods in the descriptor protocol: __get__() , __set__() , and __delete__() . If any of those methods are defined for an object, it is said to be a descriptor.

The default behavior for attribute access is to get, set, or delete the attribute from an object's dictionary. For instance, a.x has a lookup chain starting with a.__dict__['x'] , then type(a).__dict__['x'] , and continuing through the base classes of type(a) excluding metaclasses.

However, if the looked-up value is an object defining one of the descriptor methods, then Python may override the default behavior and invoke the descriptor method instead. Where this occurs in the precedence chain depends on which descriptor methods were defined and how they were called.

The starting point for descriptor invocation is a binding, a.x . How the arguments are assembled depends on a :

Direct Call
The simplest and least common call is when user code directly invokes a descriptor method: x.__get__(a) .
Instance Binding
If binding to an object instance, a.x is transformed into the call: type(a).__dict__['x'].__get__(a, type(a)) .
Class Binding
If binding to a class, A.x is transformed into the call: A.__dict__['x'].__get__(None, A) .
Super Binding
If a is an instance of super , then the binding super(B, obj).m() searches obj.__class__.__mro__ for the base class A immediately preceding B and then invokes the descriptor with the call: A.__dict__['m'].__get__(obj, obj.__class__) .

For instance bindings, the precedence of descriptor invocation depends on which descriptor methods are defined. A descriptor can define any combination of __get__() , __set__() and __delete__() . If it does not define __get__() , then accessing the attribute will return the descriptor object itself unless there is a value in the object's instance dictionary. If the descriptor defines __set__() and/or __delete__() , it is a data descriptor; if it defines neither, it is a non-data descriptor. Normally, data descriptors define both __get__() and __set__() , while non-data descriptors have just the __get__() method. Data descriptors with __set__() and __get__() defined always override a redefinition in an instance dictionary. In contrast, non-data descriptors can be overridden by instances.
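A sketch of a data descriptor (it defines both __get__() and __set__(), so it takes precedence over the instance dictionary); Typed and Account are invented names:

```python
class Typed:
    """Data descriptor enforcing a type on one attribute."""
    def __init__(self, kind):
        self.kind = kind
    def __set_name__(self, owner, name):   # Python 3.6+
        self.name = name
    def __get__(self, instance, owner):
        if instance is None:               # accessed on the class itself
            return self
        return instance.__dict__[self.name]
    def __set__(self, instance, value):
        if not isinstance(value, self.kind):
            raise TypeError(f"{self.name} must be {self.kind.__name__}")
        instance.__dict__[self.name] = value

class Account:
    balance = Typed(int)

a = Account()
a.balance = 10
print(a.balance)    # 10
```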

Python methods (including staticmethod() and classmethod() ) are implemented as non-data descriptors. Accordingly, instances can redefine and override methods. This allows individual instances to acquire behaviors that differ from other instances of the same class.

The property() function is implemented as a data descriptor. Accordingly, instances cannot override the behavior of a property.

3.3.2.3. __slots__

By default, instances of classes have a dictionary for attribute storage. This wastes space for objects having very few instance variables. The space consumption can become acute when creating large numbers of instances.

The default can be overridden by defining __slots__ in a class definition. The __slots__ declaration takes a sequence of instance variables and reserves just enough space in each instance to hold a value for each variable. Space is saved because __dict__ is not created for each instance.

object.__slots__
This class variable can be assigned a string, iterable, or sequence of strings with variable names used by instances. __slots__ reserves space for the declared variables and prevents the automatic creation of __dict__ and __weakref__ for each instance.
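A minimal sketch (PointSlots is an invented class):

```python
class PointSlots:
    __slots__ = ("x", "y")    # no per-instance __dict__ is created
    def __init__(self, x, y):
        self.x, self.y = x, y

p = PointSlots(1, 2)
print(p.x, p.y)                 # 1 2
print(hasattr(p, "__dict__"))   # False
try:
    p.z = 3                     # not declared in __slots__
except AttributeError as exc:
    print("AttributeError:", exc)
```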
3.3.2.3.1. Notes on using __slots__
3.3.3. Customizing class creation

Whenever a class inherits from another class, __init_subclass__ is called on that class. This way, it is possible to write classes which change the behavior of subclasses. This is closely related to class decorators, but where class decorators only affect the specific class they're applied to, __init_subclass__ solely applies to future subclasses of the class defining the method.

classmethod object.__init_subclass__(cls)
This method is called whenever the containing class is subclassed. cls is then the new subclass. If defined as a normal instance method, this method is implicitly converted to a class method.

Keyword arguments which are given to a new class are passed to the parent class's __init_subclass__ . For compatibility with other classes using __init_subclass__ , one should take out the needed keyword arguments and pass the others over to the base class, as in:

class Philosopher:
    def __init_subclass__(cls, default_name, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.default_name = default_name

class AustralianPhilosopher(Philosopher, default_name="Bruce"):
    pass

The default implementation object.__init_subclass__ does nothing, but raises an error if it is called with any arguments.

Note

The metaclass hint metaclass is consumed by the rest of the type machinery, and is never passed to __init_subclass__ implementations. The actual metaclass (rather than the explicit hint) can be accessed as type(cls) . New in version 3.6.

3.3.3.1. Metaclasses

By default, classes are constructed using type() . The class body is executed in a new namespace and the class name is bound locally to the result of type(name, bases, namespace) .

The class creation process can be customized by passing the metaclass keyword argument in the class definition line, or by inheriting from an existing class that included such an argument. In the following example, both MyClass and MySubclass are instances of Meta :

class Meta(type):
    pass

class MyClass(metaclass=Meta):
    pass

class MySubclass(MyClass):
    pass

Any other keyword arguments that are specified in the class definition are passed through to all metaclass operations described below.

When a class definition is executed, the following steps occur: the appropriate metaclass is determined; the class namespace is prepared; the class body is executed; the class object is created. Each step is described in the subsections below.

3.3.3.2. Determining the appropriate metaclass

The appropriate metaclass for a class definition is determined as follows:

The most derived metaclass is selected from the explicitly specified metaclass (if any) and the metaclasses (i.e. type(cls) ) of all specified base classes. The most derived metaclass is one which is a subtype of all of these candidate metaclasses. If none of the candidate metaclasses meets that criterion, then the class definition will fail with TypeError .

3.3.3.3. Preparing the class namespace

Once the appropriate metaclass has been identified, then the class namespace is prepared. If the metaclass has a __prepare__ attribute, it is called as namespace = metaclass.__prepare__(name, bases, **kwds) (where the additional keyword arguments, if any, come from the class definition).

If the metaclass has no __prepare__ attribute, then the class namespace is initialised as an empty ordered mapping.

See also

PEP 3115 - Metaclasses in Python 3000
Introduced the __prepare__ namespace hook
3.3.3.4. Executing the class body

The class body is executed (approximately) as exec(body, globals(), namespace) . The key difference from a normal call to exec() is that lexical scoping allows the class body (including any methods) to reference names from the current and outer scopes when the class definition occurs inside a function.

However, even when the class definition occurs inside the function, methods defined inside the class still cannot see names defined at the class scope. Class variables must be accessed through the first parameter of instance or class methods, or through the implicit lexically scoped __class__ reference described in the next section.

3.3.3.5. Creating the class object

Once the class namespace has been populated by executing the class body, the class object is created by calling metaclass(name, bases, namespace, **kwds) (the additional keywords passed here are the same as those passed to __prepare__ ).

This class object is the one that will be referenced by the zero-argument form of super() . __class__ is an implicit closure reference created by the compiler if any methods in a class body refer to either __class__ or super . This allows the zero argument form of super() to correctly identify the class being defined based on lexical scoping, while the class or instance that was used to make the current call is identified based on the first argument passed to the method.

CPython implementation detail: In CPython 3.6 and later, the __class__ cell is passed to the metaclass as a __classcell__ entry in the class namespace. If present, this must be propagated up to the type.__new__ call in order for the class to be initialised correctly. Failing to do so will result in a DeprecationWarning in Python 3.6, and a RuntimeWarning in the future.

When using the default metaclass type , or any metaclass that ultimately calls type.__new__ , the following additional customisation steps are invoked after creating the class object: first, type.__new__ collects all of the descriptors in the class namespace that define a __set_name__() method; second, all of these __set_name__ methods are called with the class being defined and the assigned name of that particular descriptor; finally, the __init_subclass__() hook is called on the immediate parent of the new class in its method resolution order.

After the class object is created, it is passed to the class decorators included in the class definition (if any) and the resulting object is bound in the local namespace as the defined class.

When a new class is created by type.__new__ , the object provided as the namespace parameter is copied to a new ordered mapping and the original object is discarded. The new copy is wrapped in a read-only proxy, which becomes the __dict__ attribute of the class object.

See also

PEP 3135 - New super
Describes the implicit __class__ closure reference
3.3.3.6. Metaclass example

The potential uses for metaclasses are boundless. Some ideas that have been explored include enum, logging, interface checking, automatic delegation, automatic property creation, proxies, frameworks, and automatic resource locking/synchronization.

Here is an example of a metaclass that uses a collections.OrderedDict to remember the order that class variables are defined:

import collections

class OrderedClass(type):

    @classmethod
    def __prepare__(metacls, name, bases, **kwds):
        return collections.OrderedDict()

    def __new__(cls, name, bases, namespace, **kwds):
        result = type.__new__(cls, name, bases, dict(namespace))
        result.members = tuple(namespace)
        return result

class A(metaclass=OrderedClass):
    def one(self): pass
    def two(self): pass
    def three(self): pass
    def four(self): pass

>>> A.members
('__module__', 'one', 'two', 'three', 'four')

When the class definition for A gets executed, the process begins with calling the metaclass's __prepare__() method which returns an empty collections.OrderedDict . That mapping records the methods and attributes of A as they are defined within the body of the class statement. Once those definitions are executed, the ordered dictionary is fully populated and the metaclass's __new__() method gets invoked. That method builds the new type and it saves the ordered dictionary keys in an attribute called members .

3.3.4. Customizing instance and subclass checks

The following methods are used to override the default behavior of the isinstance() and issubclass() built-in functions.

In particular, the metaclass abc.ABCMeta implements these methods in order to allow the addition of Abstract Base Classes (ABCs) as "virtual base classes" to any class or type (including built-in types), including other ABCs.

class.__instancecheck__(self, instance)
Return true if instance should be considered a (direct or indirect) instance of class. If defined, called to implement isinstance(instance, class).
class.__subclasscheck__(self, subclass)
Return true if subclass should be considered a (direct or indirect) subclass of class. If defined, called to implement issubclass(subclass, class).

Note that these methods are looked up on the type (metaclass) of a class. They cannot be defined as class methods in the actual class. This is consistent with the lookup of special methods that are called on instances, only in this case the instance is itself a class.
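For example, here is a minimal sketch (class names hypothetical) of duck-typed instance checks implemented on a metaclass:

```python
class DuckMeta(type):
    def __instancecheck__(cls, instance):
        # isinstance(x, Duck) succeeds for any object with a callable .quack()
        return callable(getattr(instance, "quack", None))

class Duck(metaclass=DuckMeta):
    pass

class Mallard:
    def quack(self):
        return "quack!"

print(isinstance(Mallard(), Duck))  # True
print(isinstance(42, Duck))         # False
```

Note that __instancecheck__ is defined on DuckMeta, not on Duck itself, exactly as the lookup rule above requires.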

See also

PEP 3119 - Introducing Abstract Base Classes
Includes the specification for customizing isinstance() and issubclass() behavior through __instancecheck__() and __subclasscheck__() , with motivation for this functionality in the context of adding Abstract Base Classes (see the abc module) to the language.
3.3.5. Emulating callable objects
object.__call__(self[, args...])

Called when the instance is "called" as a function; if this method is defined, x(arg1, arg2, ...) is a shorthand for x.__call__(arg1, arg2, ...) .
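A small illustrative sketch: an instance that behaves like a function.

```python
class Adder:
    """Instances are callable: adder(x) invokes adder.__call__(x)."""
    def __init__(self, n):
        self.n = n

    def __call__(self, x):
        return self.n + x

add5 = Adder(5)
print(add5(3))            # 8
print(add5.__call__(3))   # 8 -- the explicit equivalent
```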

3.3.6. Emulating container types

The following methods can be defined to implement container objects. Containers usually are sequences (such as lists or tuples) or mappings (like dictionaries), but can represent other containers as well. The first set of methods is used either to emulate a sequence or to emulate a mapping; the difference is that for a sequence, the allowable keys should be the integers k for which 0 <= k < N, where N is the length of the sequence, or slice objects, which define a range of items.

It is also recommended that mappings provide the methods keys(), values(), items(), get(), clear(), setdefault(), pop(), popitem(), copy(), and update() behaving similarly to those for Python's standard dictionary objects. The collections module provides a MutableMapping abstract base class to help create those methods from a base set of __getitem__(), __setitem__(), __delitem__(), and keys(). Mutable sequences should provide the methods append(), count(), index(), extend(), insert(), pop(), remove(), reverse() and sort(), like Python's standard list objects.

Finally, sequence types should implement addition (meaning concatenation) and multiplication (meaning repetition) by defining the methods __add__(), __radd__(), __iadd__(), __mul__(), __rmul__() and __imul__() described below; they should not define other numerical operators.

It is recommended that both mappings and sequences implement the __contains__() method to allow efficient use of the in operator; for mappings, in should search the mapping's keys; for sequences, it should search through the values. It is further recommended that both mappings and sequences implement the __iter__() method to allow efficient iteration through the container; for mappings, __iter__() should be the same as keys(); for sequences, it should iterate through the values.
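As a sketch of the sequence side of this protocol (class name hypothetical), here is a read-only sequence that supports len(), indexing (including negative indices), slices, and -- via the __getitem__() fallback -- the in operator:

```python
class Squares:
    """Read-only sequence of the first n square numbers."""
    def __init__(self, n):
        self._n = n

    def __len__(self):
        return self._n

    def __getitem__(self, key):
        if isinstance(key, slice):
            # slice.indices() clips the slice to the actual length
            return [self[i] for i in range(*key.indices(self._n))]
        if key < 0:                 # emulating negative indexing is up to us
            key += self._n
        if not 0 <= key < self._n:
            raise IndexError(key)   # lets for loops detect the end
        return key * key

s = Squares(5)
print(len(s), s[2], s[-1])   # 5 4 16
print(s[1:4])                # [1, 4, 9]
print(9 in s)                # True -- falls back to iterating __getitem__
```

Because no __contains__() or __iter__() is defined, the membership test falls back to the old sequence iteration protocol described later in this section.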

object.__len__(self)

Called to implement the built-in function len() . Should return the length of the object, an integer >= 0. Also, an object that doesn't define a __bool__() method and whose __len__() method returns zero is considered to be false in a Boolean context.

CPython implementation detail: In CPython, the length is required to be at most sys.maxsize . If the length is larger than sys.maxsize some features (such as len() ) may raise OverflowError . To prevent raising OverflowError by truth value testing, an object must define a __bool__() method.
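For instance (a minimal sketch), defining __len__() alone gives an object both a length and list-like truthiness:

```python
class Box:
    def __init__(self, items):
        self.items = list(items)

    def __len__(self):
        # With no __bool__() defined, bool(box) is len(box) != 0
        return len(self.items)

print(len(Box("abc")))   # 3
print(bool(Box([])))     # False -- empty, so falsy
print(bool(Box([1])))    # True
```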
object.__length_hint__(self)
Called to implement operator.length_hint() . Should return an estimated length for the object (which may be greater or less than the actual length). The length must be an integer >= 0. This method is purely an optimization and is never required for correctness. New in version 3.4.

Note

Slicing is done exclusively with the following three methods. A call like

a[1:2] = b

is translated to

a[slice(1, 2, None)] = b

and so forth. Missing slice items are always filled in with None .

object.__getitem__(self, key)

Called to implement evaluation of self[key] . For sequence types, the accepted keys should be integers and slice objects. Note that the special interpretation of negative indexes (if the class wishes to emulate a sequence type) is up to the __getitem__() method. If key is of an inappropriate type, TypeError may be raised; if of a value outside the set of indexes for the sequence (after any special interpretation of negative values), IndexError should be raised. For mapping types, if key is missing (not in the container), KeyError should be raised.

Note

for loops expect that an IndexError will be raised for illegal indexes to allow proper detection of the end of the sequence.

object.__missing__(self, key)
Called by dict.__getitem__() to implement self[key] for dict subclasses when key is not in the dictionary.
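collections.defaultdict is built on this hook; a simplified sketch of the same idea:

```python
class CountingDict(dict):
    """dict subclass: absent keys read as 0 (a sketch of defaultdict(int))."""
    def __missing__(self, key):
        # Called by dict.__getitem__ only when key is absent
        return 0

tally = CountingDict()
tally["spam"] += 1      # reads 0 via __missing__, then stores 1
tally["spam"] += 1
print(tally["spam"], tally["eggs"])   # 2 0
print("eggs" in tally)                # False -- __missing__ did not insert it
```

Unlike defaultdict, this sketch does not insert the default value into the dictionary; whether to do so is a design choice made inside __missing__().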
object.__setitem__(self, key, value)
Called to implement assignment to self[key] . Same note as for __getitem__() . This should only be implemented for mappings if the objects support changes to the values for keys, or if new keys can be added, or for sequences if elements can be replaced. The same exceptions should be raised for improper key values as for the __getitem__() method.
object.__delitem__(self, key)
Called to implement deletion of self[key] . Same note as for __getitem__() . This should only be implemented for mappings if the objects support removal of keys, or for sequences if elements can be removed from the sequence. The same exceptions should be raised for improper key values as for the __getitem__() method.
object.__iter__(self)
This method is called when an iterator is required for a container. This method should return a new iterator object that can iterate over all the objects in the container. For mappings, it should iterate over the keys of the container.

Iterator objects also need to implement this method; they are required to return themselves. For more information on iterator objects, see Iterator Types .

object.__reversed__(self)
Called (if present) by the reversed() built-in to implement reverse iteration. It should return a new iterator object that iterates over all the objects in the container in reverse order.

If the __reversed__() method is not provided, the reversed() built-in will fall back to using the sequence protocol ( __len__() and __getitem__() ). Objects that support the sequence protocol should only provide __reversed__() if they can provide an implementation that is more efficient than the one provided by reversed() .

The membership test operators ( in and not in ) are normally implemented as an iteration through a sequence. However, container objects can supply the following special method with a more efficient implementation, which also does not require the object be a sequence.

object.__contains__(self, item)
Called to implement membership test operators. Should return true if item is in self , false otherwise. For mapping objects, this should consider the keys of the mapping rather than the values or the key-item pairs.

For objects that don't define __contains__() , the membership test first tries iteration via __iter__() , then the old sequence iteration protocol via __getitem__() , see this section in the language reference .
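A sketch of a container that answers membership tests without being iterable at all:

```python
class Evens:
    """Conceptually infinite container: every even integer is 'in' it."""
    def __contains__(self, item):
        return isinstance(item, int) and item % 2 == 0

print(4 in Evens())         # True
print(3 in Evens())         # False
print(10**100 in Evens())   # True -- no iteration needed
```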

3.3.7. Emulating numeric types

The following methods can be defined to emulate numeric objects. Methods corresponding to operations that are not supported by the particular kind of number implemented (e.g., bitwise operations for non-integral numbers) should be left undefined.

object.__add__(self, other)
object.__sub__(self, other)
object.__mul__(self, other)
object.__matmul__(self, other)
object.__truediv__(self, other)
object.__floordiv__(self, other)
object.__mod__(self, other)
object.__divmod__(self, other)
object.__pow__(self, other[, modulo])
object.__lshift__(self, other)
object.__rshift__(self, other)
object.__and__(self, other)
object.__xor__(self, other)
object.__or__(self, other)

These methods are called to implement the binary arithmetic operations (+, -, *, @, /, //, %, divmod(), pow(), **, <<, >>, &, ^, |). For instance, to evaluate the expression x + y, where x is an instance of a class that has an __add__() method, x.__add__(y) is called. The __divmod__() method should be the equivalent to using __floordiv__() and __mod__(); it should not be related to __truediv__(). Note that __pow__() should be defined to accept an optional third argument if the ternary version of the built-in pow() function is to be supported.

If one of those methods does not support the operation with the supplied arguments, it should return NotImplemented .

object.__radd__(self, other)
object.__rsub__(self, other)
object.__rmul__(self, other)
object.__rmatmul__(self, other)
object.__rtruediv__(self, other)
object.__rfloordiv__(self, other)
object.__rmod__(self, other)
object.__rdivmod__(self, other)
object.__rpow__(self, other)
object.__rlshift__(self, other)
object.__rrshift__(self, other)
object.__rand__(self, other)
object.__rxor__(self, other)
object.__ror__(self, other)

These methods are called to implement the binary arithmetic operations (+, -, *, @, /, //, %, divmod(), pow(), **, <<, >>, &, ^, |) with reflected (swapped) operands. These functions are only called if the left operand does not support the corresponding operation [3] and the operands are of different types. [4] For instance, to evaluate the expression x - y, where y is an instance of a class that has an __rsub__() method, y.__rsub__(x) is called if x.__sub__(y) returns NotImplemented.

Note that ternary pow() will not try calling __rpow__() (the coercion rules would become too complicated).

Note

If the right operand's type is a subclass of the left operand's type and that subclass provides the reflected method for the operation, this method will be called before the left operand's non-reflected method. This behavior allows subclasses to override their ancestors' operations.
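A sketch (class name hypothetical) showing NotImplemented and a reflected method working together:

```python
class Meters:
    def __init__(self, v):
        self.v = v

    def __add__(self, other):
        if isinstance(other, (int, float)):
            return Meters(self.v + other)
        return NotImplemented   # signal: let Python try the other operand

    def __radd__(self, other):
        # Called for e.g. 2 + Meters(1) after int.__add__ returns NotImplemented
        return self.__add__(other)

print((Meters(1) + 2).v)   # 3
print((2 + Meters(1)).v)   # 3 -- via __radd__
try:
    Meters(1) + "x"        # both sides return NotImplemented -> TypeError
except TypeError:
    print("TypeError")
```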

object.__iadd__(self, other)
object.__isub__(self, other)
object.__imul__(self, other)
object.__imatmul__(self, other)
object.__itruediv__(self, other)
object.__ifloordiv__(self, other)
object.__imod__(self, other)
object.__ipow__(self, other[, modulo])
object.__ilshift__(self, other)
object.__irshift__(self, other)
object.__iand__(self, other)
object.__ixor__(self, other)
object.__ior__(self, other)
These methods are called to implement the augmented arithmetic assignments (+=, -=, *=, @=, /=, //=, %=, **=, <<=, >>=, &=, ^=, |=). These methods should attempt to do the operation in-place (modifying self) and return the result (which could be, but does not have to be, self). If a specific method is not defined, the augmented assignment falls back to the normal methods. For instance, if x is an instance of a class with an __iadd__() method, x += y is equivalent to x = x.__iadd__(y). Otherwise, x.__add__(y) and y.__radd__(x) are considered, as with the evaluation of x + y. In certain situations, augmented assignment can result in unexpected errors (see Why does a_tuple[i] += ['item'] raise an exception when the addition works?), but this behavior is in fact part of the data model.
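An in-place sketch: __iadd__() mutates and returns the same object, so += does not rebind the name to a new one:

```python
class Tally:
    def __init__(self):
        self.total = 0

    def __iadd__(self, other):
        self.total += other
        return self        # conventionally return self for in-place ops

t = Tally()
before = id(t)
t += 5
t += 2
print(t.total)             # 7
print(id(t) == before)     # True -- still the very same object
```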
object.__neg__(self)
object.__pos__(self)
object.__abs__(self)
object.__invert__(self)

Called to implement the unary arithmetic operations (-, +, abs() and ~).

object.__complex__(self)
object.__int__(self)
object.__float__(self)
object.__round__(self[, n])

Called to implement the built-in functions complex() , int() , float() and round() . Should return a value of the appropriate type.

object.__index__(self)
Called to implement operator.index() , and whenever Python needs to losslessly convert the numeric object to an integer object (such as in slicing, or in the built-in bin() , hex() and oct() functions). Presence of this method indicates that the numeric object is an integer type. Must return an integer.

Note

In order to have a coherent integer type class, when __index__() is defined __int__() should also be defined, and both should return the same value.
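A minimal sketch (class name hypothetical) of an integer-like type usable wherever Python needs a real index:

```python
class Nibble:
    """Hypothetical integer-like wrapper."""
    def __init__(self, n):
        self._n = n

    def __index__(self):
        return self._n

    def __int__(self):
        # Kept consistent with __index__, as recommended above
        return self._n

print(hex(Nibble(255)))             # 0xff
print(list(range(5))[Nibble(2)])    # 2 -- accepted as a sequence index
```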

3.3.8. With Statement Context Managers

A context manager is an object that defines the runtime context to be established when executing a with statement. The context manager handles the entry into, and the exit from, the desired runtime context for the execution of the block of code. Context managers are normally invoked using the with statement (described in section The with statement ), but can also be used by directly invoking their methods.

Typical uses of context managers include saving and restoring various kinds of global state, locking and unlocking resources, closing opened files, etc.

For more information on context managers, see Context Manager Types .

object.__enter__(self)
Enter the runtime context related to this object. The with statement will bind this method's return value to the target(s) specified in the as clause of the statement, if any.
object.__exit__(self, exc_type, exc_value, traceback)
Exit the runtime context related to this object. The parameters describe the exception that caused the context to be exited. If the context was exited without an exception, all three arguments will be None .

If an exception is supplied, and the method wishes to suppress the exception (i.e., prevent it from being propagated), it should return a true value. Otherwise, the exception will be processed normally upon exit from this method.

Note that __exit__() methods should not reraise the passed-in exception; this is the caller's responsibility.
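As a sketch, a context manager that suppresses a chosen exception type by returning a true value from __exit__() (similar in spirit to contextlib.suppress):

```python
class Suppress:
    def __init__(self, *exc_types):
        self.exc_types = exc_types

    def __enter__(self):
        return self          # bound to the `as` target, if any

    def __exit__(self, exc_type, exc_value, traceback):
        # True -> swallow the exception; False/None -> let it propagate
        return exc_type is not None and issubclass(exc_type, self.exc_types)

with Suppress(ZeroDivisionError):
    1 / 0                    # raised, then swallowed by __exit__
print("survived")
```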

See also

PEP 343 - The "with" statement
The specification, background, and examples for the Python with statement.
3.3.9. Special method lookup

For custom classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object's type, not in the object's instance dictionary. That behaviour is the reason why the following code raises an exception:

>>>
>>> class C:
...     pass
...
>>> c = C()
>>> c.__len__ = lambda: 5
>>> len(c)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: object of type 'C' has no len()

The rationale behind this behaviour lies with a number of special methods such as __hash__() and __repr__() that are implemented by all objects, including type objects. If the implicit lookup of these methods used the conventional lookup process, they would fail when invoked on the type object itself:

>>>
>>> 1 .__hash__() == hash(1)
True
>>> int.__hash__() == hash(int)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: descriptor '__hash__' of 'int' object needs an argument

Incorrectly attempting to invoke an unbound method of a class in this way is sometimes referred to as 'metaclass confusion', and is avoided by bypassing the instance when looking up special methods:

>>>
>>> type(1).__hash__(1) == hash(1)
True
>>> type(int).__hash__(int) == hash(int)
True

In addition to bypassing any instance attributes in the interest of correctness, implicit special method lookup generally also bypasses the __getattribute__() method even of the object's metaclass:

>>>
>>> class Meta(type):
...     def __getattribute__(*args):
...         print("Metaclass getattribute invoked")
...         return type.__getattribute__(*args)
...
>>> class C(object, metaclass=Meta):
...     def __len__(self):
...         return 10
...     def __getattribute__(*args):
...         print("Class getattribute invoked")
...         return object.__getattribute__(*args)
...
>>> c = C()
>>> c.__len__()                 # Explicit lookup via instance
Class getattribute invoked
10
>>> type(c).__len__(c)          # Explicit lookup via type
Metaclass getattribute invoked
10
>>> len(c)                      # Implicit lookup
10

Bypassing the __getattribute__() machinery in this fashion provides significant scope for speed optimisations within the interpreter, at the cost of some flexibility in the handling of special methods (the special method must be set on the class object itself in order to be consistently invoked by the interpreter).

3.4. Coroutines

3.4.1. Awaitable Objects

An awaitable object generally implements an __await__() method. Coroutine objects returned from async def functions are awaitable.

Note

The generator iterator objects returned from generators decorated with types.coroutine() or asyncio.coroutine() are also awaitable, but they do not implement __await__() .

object.__await__(self)
Must return an iterator . Should be used to implement awaitable objects. For instance, asyncio.Future implements this method to be compatible with the await expression.
New in version 3.5.
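A hand-rolled sketch of the protocol (class name hypothetical): __await__ written as a generator that suspends once and then produces a result:

```python
class Nap:
    def __await__(self):
        # Yielding here suspends the awaiting coroutine; the yielded value
        # travels out to whatever drives it (normally an event loop).
        yield "suspended"
        return 99                  # becomes the value of `await Nap()`

async def main():
    return await Nap()

c = main()
step = c.send(None)                # runs until the yield inside __await__
try:
    c.send(None)                   # resume; Nap finishes, main() returns
except StopIteration as exc:
    result = exc.value
print(step, result)                # suspended 99
```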

See also

PEP 492 for additional information about awaitable objects.

3.4.2. Coroutine Objects

Coroutine objects are awaitable objects. A coroutine's execution can be controlled by calling __await__() and iterating over the result. When the coroutine has finished executing and returns, the iterator raises StopIteration , and the exception's value attribute holds the return value. If the coroutine raises an exception, it is propagated by the iterator. Coroutines should not directly raise unhandled StopIteration exceptions.
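Driving a coroutine by hand (a minimal sketch) makes this visible:

```python
async def compute():
    return 21 * 2

c = compute()
try:
    c.send(None)                   # start execution; runs to completion here
except StopIteration as exc:
    result = exc.value             # the return value rides on StopIteration
print(result)                      # 42

c2 = compute()
c2.close()                         # marked as finished without ever starting
```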

Coroutines also have the methods listed below, which are analogous to those of generators (see Generator-iterator methods ). However, unlike generators, coroutines do not directly support iteration.

Changed in version 3.5.2: It is a RuntimeError to await on a coroutine more than once.
coroutine.send(value)
Starts or resumes execution of the coroutine. If value is None , this is equivalent to advancing the iterator returned by __await__() . If value is not None , this method delegates to the send() method of the iterator that caused the coroutine to suspend. The result (return value, StopIteration , or other exception) is the same as when iterating over the __await__() return value, described above.
coroutine.throw(type[, value[, traceback]])
Raises the specified exception in the coroutine. This method delegates to the throw() method of the iterator that caused the coroutine to suspend, if it has such a method. Otherwise, the exception is raised at the suspension point. The result (return value, StopIteration , or other exception) is the same as when iterating over the __await__() return value, described above. If the exception is not caught in the coroutine, it propagates back to the caller.
coroutine.close()
Causes the coroutine to clean itself up and exit. If the coroutine is suspended, this method first delegates to the close() method of the iterator that caused the coroutine to suspend, if it has such a method. Then it raises GeneratorExit at the suspension point, causing the coroutine to immediately clean itself up. Finally, the coroutine is marked as having finished executing, even if it was never started.

Coroutine objects are automatically closed using the above process when they are about to be destroyed.

3.4.3. Asynchronous Iterators

An asynchronous iterable is able to call asynchronous code in its __aiter__ implementation, and an asynchronous iterator can call asynchronous code in its __anext__ method.

Asynchronous iterators can be used in an async for statement.

object.__aiter__(self)
Must return an asynchronous iterator object.
object.__anext__(self)
Must return an awaitable resulting in the next value of the iterator. Should raise a StopAsyncIteration error when the iteration is over.

An example of an asynchronous iterable object:

class Reader:
    async def readline(self):
        ...

    def __aiter__(self):
        return self

    async def __anext__(self):
        val = await self.readline()
        if val == b'':
            raise StopAsyncIteration
        return val
New in version 3.5.

Note

Changed in version 3.5.2: Starting with CPython 3.5.2, __aiter__ can directly return asynchronous iterators . Returning an awaitable object will result in a PendingDeprecationWarning .

The recommended way of writing backwards compatible code in CPython 3.5.x is to continue returning awaitables from __aiter__ . If you want to avoid the PendingDeprecationWarning and keep the code backwards compatible, the following decorator can be used:

import functools
import sys

if sys.version_info < (3, 5, 2):
    def aiter_compat(func):
        @functools.wraps(func)
        async def wrapper(self):
            return func(self)
        return wrapper
else:
    def aiter_compat(func):
        return func

Example:

class AsyncIterator:

    @aiter_compat
    def __aiter__(self):
        return self

    async def __anext__(self):
        ...

Starting with CPython 3.6, the PendingDeprecationWarning will be replaced with the DeprecationWarning. In CPython 3.7, returning an awaitable from __aiter__ will result in a RuntimeError.

3.4.4. Asynchronous Context Managers

An asynchronous context manager is a context manager that is able to suspend execution in its __aenter__ and __aexit__ methods.

Asynchronous context managers can be used in an async with statement.

object.__aenter__(self)
This method is semantically similar to __enter__(), the only difference being that it must return an awaitable.
object.__aexit__(self, exc_type, exc_value, traceback)
This method is semantically similar to __exit__(), the only difference being that it must return an awaitable.

An example of an asynchronous context manager class:

class AsyncContextManager:
    async def __aenter__(self):
        await log('entering context')

    async def __aexit__(self, exc_type, exc, tb):
        await log('exiting context')
New in version 3.5.

Footnotes

[1] It is possible in some cases to change an object's type, under certain controlled conditions. It generally isn't a good idea though, since it can lead to some very strange behaviour if it is handled incorrectly.
[2] The __hash__() , __iter__() , __reversed__() , and __contains__() methods have special handling for this; others will still raise a TypeError , but may do so by relying on the behavior that None is not callable.
[3] "Does not support" here means that the class has no such method, or the method returns NotImplemented . Do not set the method to None if you want to force fallback to the right operand's reflected method -- that will instead have the opposite effect of explicitly blocking such fallback.
[4] For operands of the same type, it is assumed that if the non-reflected method (such as __add__() ) fails the operation is not supported, which is why the reflected method is not called.

[Dec 07, 2017] Variable's memory size in Python - Stack Overflow

Dec 07, 2017 | stackoverflow.com

casevh, Jan 17, 2013 at 5:03

Regarding the internal structure of a Python long, check sys.int_info (or sys.long_info for Python 2.7).
>>> import sys
>>> sys.int_info
sys.int_info(bits_per_digit=30, sizeof_digit=4)

Python either stores 30 bits into 4 bytes (most 64-bit systems) or 15 bits into 2 bytes (most 32-bit systems). Comparing the actual memory usage with calculated values, I get

>>> import math, sys
>>> a=0
>>> sys.getsizeof(a)
24
>>> a=2**100
>>> sys.getsizeof(a)
40
>>> a=2**1000
>>> sys.getsizeof(a)
160
>>> 24+4*math.ceil(100/30)
40
>>> 24+4*math.ceil(1000/30)
160

There are 24 bytes of overhead for 0 since no bits are stored. The memory requirements for larger values matches the calculated values.

If your numbers are so large that you are concerned about the 6.25% unused bits, you should probably look at the gmpy2 library. The internal representation uses all available bits and computations are significantly faster for large values (say, greater than 100 digits).

[Dec 07, 2017] Variables and scope -- Object-Oriented Programming in Python 1 documentation

Notable quotes:
"... class attributes ..."
"... instance attributes ..."
"... alter the existing value ..."
"... implicit conversion ..."
Dec 07, 2017 | python-textbok.readthedocs.io

Variables and scope

Variables

Recall that a variable is a label for a location in memory. It can be used to hold a value. In statically typed languages, variables have predetermined types, and a variable can only be used to hold values of that type. In Python, we may reuse the same variable to store values of any type.

A variable is similar to the memory functionality found in most calculators, in that it holds one value which can be retrieved many times, and that storing a new value erases the old. A variable differs from a calculator's memory in that one can have many variables storing different values, and that each variable is referred to by name.

Defining variables

To define a new variable in Python, we simply assign a value to a label. For example, this is how we create a variable called count , which contains an integer value of zero:

count = 0

This is exactly the same syntax as assigning a new value to an existing variable called count . Later in this chapter we will discuss under what circumstances this statement will cause a new variable to be created.

If we try to access the value of a variable which hasn't been defined anywhere yet, the interpreter will exit with a name error.

We can define several variables in one line, but this is usually considered bad style:

# Define three variables at once:
count, result, total = 0, 0, 0

# This is equivalent to:
count = 0
result = 0
total = 0

In keeping with good programming style, we should make use of meaningful names for variables.

Variable scope and lifetime

Not all variables are accessible from all parts of our program, and not all variables exist for the same amount of time. Where a variable is accessible and how long it exists depend on how it is defined. We call the part of a program where a variable is accessible its scope , and the duration for which the variable exists its lifetime .

A variable which is defined in the main body of a file is called a global variable. It will be visible throughout the file, and also inside any file which imports that file. Global variables can have unintended consequences because of their wide-ranging effects – that is why we should almost never use them. Only objects which are intended to be used globally, like functions and classes, should be put in the global namespace.

A variable which is defined inside a function is local to that function. It is accessible from the point at which it is defined until the end of the function, and exists for as long as the function is executing. The parameter names in the function definition behave like local variables, but they contain the values that we pass into the function when we call it. When we use the assignment operator ( = ) inside a function, its default behaviour is to create a new local variable – unless a variable with the same name is already defined in the local scope.

Here is an example of variables in different scopes:

# This is a global variable
a = 0

if a == 0:
    # This is still a global variable
    b = 1

def my_function(c):
    # this is a local variable
    d = 3
    print(c)
    print(d)

# Now we call the function, passing the value 7 as the first and only parameter
my_function(7)

# a and b still exist
print(a)
print(b)

# c and d don't exist anymore -- these statements will give us name errors!
print(c)
print(d)

Note

The inside of a class body is also a new local variable scope. Variables which are defined in the class body (but outside any class method) are called class attributes. They can be referenced by their bare names within the same scope, but they can also be accessed from outside this scope if we use the attribute access operator ( . ) on a class or an instance (an object which uses that class as its type). An attribute can also be set explicitly on an instance or class from inside a method. Attributes set on instances are called instance attributes. Class attributes are shared between all instances of a class, but each instance has its own separate instance attributes. We will look at this in greater detail in the chapter about classes.

The assignment operator

As we saw in the previous sections, the assignment operator in Python is a single equals sign ( = ). This operator assigns the value on the right hand side to the variable on the left hand side, sometimes creating the variable first. If the right hand side is an expression (such as an arithmetic expression), it will be evaluated before the assignment occurs. Here are a few examples:

a_number = 5              # a_number becomes 5
a_number = total          # a_number becomes the value of total
a_number = total + 5      # a_number becomes the value of total + 5
a_number = a_number + 1   # a_number becomes the value of a_number + 1

The last statement might look a bit strange if we were to interpret = as a mathematical equals sign – clearly a number cannot be equal to the same number plus one! Remember that = is an assignment operator – this statement is assigning a new value to the variable a_number which is equal to the old value of a_number plus one.

Assigning an initial value to variable is called initialising the variable. In some languages defining a variable can be done in a separate step before the first value assignment. It is thus possible in those languages for a variable to be defined but not have a value – which could lead to errors or unexpected behaviour if we try to use the value before it has been assigned. In Python a variable is defined and assigned a value in a single step, so we will almost never encounter situations like this.

The left hand side of the assignment statement must be a valid target:

# this is fine:
a = 3

# these are all illegal:
3 = 4
3 = a
a + b = 3

An assignment statement may have multiple targets separated by equals signs. The expression on the right hand side of the last equals sign will be assigned to all the targets. All the targets must be valid:

# both a and b will be set to zero:
a = b = 0

# this is illegal, because we can't set 0 to b:
a = 0 = b
Compound assignment operators

We have already seen that we can assign the result of an arithmetic expression to a variable:

total = a + b + c + 50

Counting is something that is done often in a program. For example, we might want to keep count of how many times a certain event occurs by using a variable called count . We would initialise this variable to zero and add one to it every time the event occurs. We would perform the addition with this statement:

count = count + 1

This is in fact a very common operation. Python has a shorthand operator, += , which lets us express it more cleanly, without having to write the name of the variable twice:

# These statements mean exactly the same thing:
count = count + 1
count += 1

# We can increment a variable by any number we like.
count += 2
count += 7
count += a + b

There is a similar operator, -= , which lets us decrement numbers:

# These statements mean exactly the same thing:
count = count - 3
count -= 3

Other common compound assignment operators are given in the table below:

Operator   Example   Equivalent to
+=         x += 5    x = x + 5
-=         x -= 5    x = x - 5
*=         x *= 5    x = x * 5
/=         x /= 5    x = x / 5
%=         x %= 5    x = x % 5
More about scope: crossing boundaries

What if we want to access a global variable from inside a function? It is possible, but doing so comes with a few caveats:

a = 0

def my_function():
    print(a)

my_function()

The print statement will output 0, the value of the global variable a, as you probably expected. But what about this program?

a = 0

def my_function():
    a = 3
    print(a)

my_function()

print(a)

When we call the function, the print statement inside outputs 3 – but why does the print statement at the end of the program output 0?

By default, the assignment statement creates variables in the local scope. So the assignment inside the function does not modify the global variable a – it creates a new local variable called a, and assigns the value 3 to that variable. The first print statement outputs the value of the new local variable – because if a local variable has the same name as a global variable the local variable will always take precedence. The last print statement prints out the global variable, which has remained unchanged.

What if we really want to modify a global variable from inside a function? We can use the global keyword:

a = 0

def my_function():
    global a
    a = 3
    print(a)

my_function()

print(a)

We may not refer to both a global variable and a local variable by the same name inside the same function. This program will give us an error:

a = 0

def my_function():
    print(a)
    a = 3
    print(a)

my_function()

Because we haven't declared a to be global, the assignment in the second line of the function will create a local variable a. This means that we can't refer to the global variable a elsewhere in the function, even before this line! The first print statement now refers to the local variable a – but this variable doesn't have a value in the first line, because we haven't assigned it yet!

Note that it is usually very bad practice to access global variables from inside functions, and even worse practice to modify them. This makes it difficult to arrange our program into logically encapsulated parts which do not affect each other in unexpected ways. If a function needs to access some external value, we should pass the value into the function as a parameter. If the function is a method of an object, it is sometimes appropriate to make the value an attribute of the same object – we will discuss this in the chapter about object orientation.

Note

There is also a nonlocal keyword in Python – when we nest a function inside another function, it allows us to modify a variable in the outer function from inside the inner function (or, if the function is nested multiple times, a variable in one of the outer functions). If we use the global keyword, the assignment statement will create the variable in the global scope if it does not exist already. If we use the nonlocal keyword, however, the variable must already be defined, because it is impossible for Python to determine in which scope it should be created.

Exercise 1

  1. Describe the scope of the variables a , b , c and d in this example:

    def my_function(a):
        b = a - 2
        return b
    
    c = 3
    
    if c > 2:
        d = my_function(5)
        print(d)
    
  2. What is the lifetime of these variables? When will they be created and destroyed?

  3. Can you guess what would happen if we were to assign c a value of 1 instead?

  4. Why would this be a problem? Can you think of a way to avoid it?
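As a short illustration of the nonlocal keyword described in the note above (the function names here are made up for illustration):

```python
def outer():
    count = 0
    def increment():
        nonlocal count  # rebind count in outer's scope instead of creating a new local
        count += 1
    increment()
    increment()
    return count

print(outer())  # 2
```

Without the nonlocal declaration, the assignment inside increment would create a new local variable and the function would fail, just as in the global-variable example above.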

Modifying values

Constants

In some languages, it is possible to define special variables which can be assigned a value only once – once their values have been set, they cannot be changed. We call these kinds of variables constants . Python does not allow us to set such a restriction on variables, but there is a widely used convention for marking certain variables to indicate that their values are not meant to change: we write their names in all caps, with underscores separating words:

# These variables are "constants" by convention:
NUMBER_OF_DAYS_IN_A_WEEK = 7
NUMBER_OF_MONTHS_IN_A_YEAR = 12

# Nothing is actually stopping us from redefining them...
NUMBER_OF_DAYS_IN_A_WEEK = 8

# ...but it's probably not a good idea.

Why do we bother defining variables that we don't intend to change? Consider this example:

MAXIMUM_MARK = 80

tom_mark = 58
print("Tom's mark is %.2f%%" % (tom_mark / MAXIMUM_MARK * 100))
# %% is how we escape a literal % inside a string

There are several good reasons to define MAXIMUM_MARK instead of just writing 80 inside the print statement. First, this gives the number a descriptive label which explains what it is – this makes the code more understandable. Second, we may eventually need to refer to this number in our program more than once. If we ever need to update our code with a new value for the maximum mark, we will only have to change it in one place, instead of finding every place where it is used – such replacements are often error-prone.

Literal numbers scattered throughout a program are known as "magic numbers" – using them is considered poor coding style. This does not apply to small numbers which are considered self-explanatory – it's easy to understand why a total is initialised to zero or incremented by one.

Sometimes we want to use a variable to distinguish between several discrete options. It is useful to refer to the option values using constants instead of using them directly if the values themselves have no intrinsic meaning:

# We define some options
LOWER, UPPER, CAPITAL = 1, 2, 3

name = "jane"
# We use our constants when assigning these values...
print_style = UPPER

# ...and when checking them:
if print_style == LOWER:
    print(name.lower())
elif print_style == UPPER:
    print(name.upper())
elif print_style == CAPITAL:
    print(name.capitalize())
else:
    # Nothing prevents us from accidentally setting print_style to 4, 90 or
    # "spoon", so we put in this fallback just in case:
    print("Unknown style option!")

In the above example, the values 1, 2 and 3 are not important – they are completely meaningless. We could equally well use other distinct numbers or the strings 'lower' , 'upper' and 'capital' . The only important thing is that the three values must be different. If we used the numbers directly instead of the constants the program would be much more confusing to read. Using meaningful strings would make the code more readable, but we could accidentally make a spelling mistake while setting one of the values and not notice – whereas if we mistype the name of one of the constants we are more likely to get an error straight away.
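As an aside not covered in the original text: since Python 3.4 the standard library's enum module gives such option sets names and type safety; this is a sketch of the same example rewritten with it:

```python
from enum import Enum

class PrintStyle(Enum):
    LOWER = 1
    UPPER = 2
    CAPITAL = 3

name = "jane"
print_style = PrintStyle.UPPER

if print_style is PrintStyle.LOWER:
    print(name.lower())
elif print_style is PrintStyle.UPPER:
    print(name.upper())       # prints JANE
elif print_style is PrintStyle.CAPITAL:
    print(name.capitalize())
```

Mistyping a member name (e.g. PrintStyle.UPER) raises an AttributeError immediately, which gives the same early-error benefit as the constant names above.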

Some Python libraries define common constants for our convenience, for example:

# we need to import these libraries before we use them
import string
import math
import re

# All the lowercase ASCII letters: 'abcdefghijklmnopqrstuvwxyz'
print(string.ascii_lowercase)

# The mathematical constants pi and e, both floating-point numbers
print(math.pi) # ratio of circumference of a circle to its diameter
print(math.e) # natural base of logarithms

# This integer is an option which we can pass to functions in the re
# (regular expression) library.
print(re.IGNORECASE)

Note that many built-in constants don't follow the all-caps naming convention.

Mutable and immutable types

Some values in Python can be modified, and some cannot. This does not mean that we can never change the value of a variable – but if a variable contains a value of an immutable type , we can only assign it a new value . We cannot alter the existing value in any way.

Integers, floating-point numbers and strings are all immutable types – in all the previous examples, when we changed the values of existing variables we used the assignment operator to assign them new values:

a = 3
a = 2

b = "jane"
b = "bob"

Even the += operator doesn't modify the value of total in place – it also assigns a new value:

total += 4

We haven't encountered any mutable types yet, but we will use them extensively in later chapters. Lists and dictionaries are mutable, and so are most objects that we are likely to write ourselves:

# this is a list of numbers
my_list = [1, 2, 3]
my_list[0] = 5 # we can change just the first element of the list
print(my_list)

class MyClass(object):
    pass # this is a very silly class

# Now we make a very simple object using our class as a type
my_object = MyClass()

# We can change the values of attributes on the object
my_object.some_property = 42
More about input

In the earlier sections of this unit we learned how to make a program display a message using the print function or read a string value from the user using the input function. What if we want the user to input numbers or other types of variables? We still use the input function, but we must convert the string values returned by input to the types that we want. Here is a simple example:

height = int(input("Enter height of rectangle: "))
width = int(input("Enter width of rectangle: "))

print("The area of the rectangle is %d" % (width * height))

int is a function which converts values of various types to ints. We will discuss type conversion in greater detail in the next section, but for now it is important to know that int will not be able to convert a string to an integer if it contains anything except digits. The program above will exit with an error if the user enters "aaa" , "zzz10" or even "7.5" . When we write a program which relies on user input, which can be incorrect, we need to add some safeguards so that we can recover if the user makes a mistake. For example, we can detect if the user entered bad input and exit with a nicer error message:

try:
    height = int(input("Enter height of rectangle: "))
    width = int(input("Enter width of rectangle: "))
except ValueError as e: # if a value error occurs, we will skip to this point
    print("Error reading height and width: %s" % e)

This program will still only attempt to read in the input once, and exit if it is incorrect. If we want to keep asking the user for input until it is correct, we can do something like this:

correct_input = False # this is a boolean value -- it can be either true or false.

while not correct_input: # this is a while loop
    try:
        height = int(input("Enter height of rectangle: "))
        width = int(input("Enter width of rectangle: "))
    except ValueError:
        print("Please enter valid integers for the height and width.")
    else: # this will be executed if there is no value error
        correct_input = True

We will learn more about boolean values, loops and exceptions later.

Example: calculating petrol consumption of a car

In this example, we will write a simple program which asks the user for the distance travelled by a car, and the monetary value of the petrol that was used to cover that distance. From this information, together with the price per litre of petrol, the program will calculate the efficiency of the car, both in litres per 100 kilometres and kilometres per litre.

First we will define the petrol price as a constant at the top. This will make it easy for us to update the price when it changes on the first Wednesday of every month:

PETROL_PRICE_PER_LITRE = 4.50

When the program starts, we want to print out a welcome message:

print("*** Welcome to the fuel efficiency calculator! ***\n")
# we add an extra blank line after the message with \n

Ask the user for his or her name:

name = input("Enter your name: ")

Ask the user for the distance travelled:

# float is a function which converts values to floating-point numbers.
distance_travelled = float(input("Enter distance travelled in km: "))

Then ask the user for the amount paid:

amount_paid = float(input("Enter monetary value of fuel bought for the trip: R"))

Now we will do the calculations:

fuel_consumed = amount_paid / PETROL_PRICE_PER_LITRE

efficiency_l_per_100_km = fuel_consumed / distance_travelled * 100
efficiency_km_per_l = distance_travelled / fuel_consumed

Finally, we output the results:

print("Hi, %s!" % name)
print("Your car's efficiency is %.2f litres per 100 km." % efficiency_l_per_100_km)
print("This means that you can travel %.2f km on a litre of petrol." % efficiency_km_per_l)

# we add an extra blank line before the message with \n
print("\nThanks for using the program.")
Exercise 2
  1. Write a Python program to convert a temperature given in degrees Fahrenheit to its equivalent in degrees Celsius. You can assume that T_c = (5/9) x (T_f - 32) , where T_c is the temperature in °C and T_f is the temperature in °F. Your program should ask the user for an input value, and print the output. The input and output values should be floating-point numbers.
  2. What could make this program crash? What would we need to do to handle this situation more gracefully?
Type conversion

As we write more programs, we will often find that we need to convert data from one type to another, for example from a string to an integer or from an integer to a floating-point number. There are two kinds of type conversions in Python: implicit and explicit conversions.

Implicit conversion

Recall from the section about floating-point operators that we can arbitrarily combine integers and floating-point numbers in an arithmetic expression – and that the result of any such expression will always be a floating-point number. This is because Python will convert the integers to floating-point numbers before evaluating the expression. This is an implicit conversion – we don't have to convert anything ourselves. There is usually no loss of precision when an integer is converted to a floating-point number.

For example, the integer 2 will automatically be converted to a floating-point number in the following example:

result = 8.5 * 2

8.5 is a float while 2 is an int . Python will automatically convert operands so that they are of the same type. In this case this is achieved if the integer 2 is converted to the floating-point equivalent 2.0 . Then the two floating-point numbers can be multiplied.

Let's have a look at a more complex example:

result = 8.5 + 7 // 3 - 2.5

Python performs operations according to the order of precedence, and decides whether a conversion is needed on a per-operation basis. In our example // has the highest precedence, so it will be processed first. 7 and 3 are both integers and // is the integer division operator – the result of this operation is the integer 2. Now we are left with 8.5 + 2 - 2.5. The addition and subtraction are at the same level of precedence, so they are evaluated left-to-right, starting with the addition. First 2 is converted to the floating-point number 2.0, and the two floating-point numbers are added, which leaves us with 10.5 - 2.5. The result of this floating-point subtraction is 8.0, which is assigned to result.

Explicit conversion
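We can verify each step of this evaluation interactively:

```python
print(7 // 3)              # 2 -- integer division is evaluated first
print(8.5 + 7 // 3)        # 10.5 -- the 2 is implicitly converted to 2.0
print(8.5 + 7 // 3 - 2.5)  # 8.0

result = 8.5 + 7 // 3 - 2.5
print(type(result))        # <class 'float'>
```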

Converting numbers from float to int will result in a loss of precision. For example, try to convert 5.834 to an int – it is not possible to do this without losing precision. In order for this to happen, we must explicitly tell Python that we are aware that precision will be lost. For example, we need to tell the compiler to convert a float to an int like this:

i = int(5.834)

The int function converts a float to an int by discarding the fractional part – it will always round down! If we want more control over the way in which the number is rounded, we will need to use a different function:

# the floor and ceil functions are in the math module
import math

# ceil returns the closest integer greater than or equal to the number
# (so it always rounds up)
i = math.ceil(5.834)

# floor returns the closest integer less than or equal to the number
# (so it always rounds down)
i = math.floor(5.834)

# round returns the closest integer to the number
# (so it rounds up or down)
# Note that this is a built-in function -- we don't need to import math to use it.
i = round(5.834)
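One caveat the text does not mention: in Python 3, round uses "round half to even" (banker's rounding), so exact halves do not always round up:

```python
import math

print(math.floor(5.834))  # 5
print(math.ceil(5.834))   # 6
print(round(5.834))       # 6

# Exact halves round to the nearest even integer:
print(round(2.5))  # 2
print(round(3.5))  # 4
```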

Explicit conversion is sometimes also called casting – we may read about a float being cast to int or vice-versa.

Converting to and from strings

As we saw in the earlier sections, Python seldom performs implicit conversions to and from str – we usually have to convert values explicitly. If we pass a single number (or any other value) to the print function, it will be converted to a string automatically – but if we try to add a number and a string, we will get an error:

# This is OK
print(5)
print(6.7)

# This is not OK
print("3" + 4)

# Do you mean this...
print("3%d" % 4) # concatenate "3" and "4" to get "34"

# Or this?
print(int("3") + 4) # add 3 and 4 to get 7

To convert numbers to strings, we can use string formatting – this is usually the cleanest and most readable way to insert multiple values into a message. If we want to convert a single number to a string, we can also use the str function explicitly:

# These lines will do the same thing
print("3%d" % 4)
print("3" + str(4))
More about conversions

In Python, functions like str , int and float will try to convert anything to their respective types – for example, we can use the int function to convert strings to integers or to convert floating-point numbers to integers. Note that although int can convert a float to an integer it can't convert a string containing a float to an integer directly!

# This is OK
int("3")

# This is OK
int(3.7)

# This is not OK
int("3.7") # This is a string representation of a float, not an integer!

# We have to convert the string to a float first
int(float("3.7"))

Values of type bool can contain the value True or False . These values are used extensively in conditional statements, which execute or do not execute parts of our program depending on some binary condition:

my_flag = True

if my_flag:
    print("Hello!")

The condition is often an expression which evaluates to a boolean value:

if 3 > 4:
    print("This will not be printed.")

However, almost any value can implicitly be converted to a boolean if it is used in a statement like this:

my_number = 3

if my_number:
    print("My number is non-zero!")

This usually behaves in the way that you would expect: non-zero numbers are True values and zero is False . However, we need to be careful when using strings – the empty string is treated as False , but any other string is True – even "0" and "False" !

# bool is a function which converts values to booleans
bool(34) # True
bool(0) # False
bool(1) # True

bool("") # False
bool("Jane") # True
bool("0") # True!
bool("False") # Also True!
Exercise 3
  1. Convert "8.8" to a float.
  2. Convert 8.8 to an integer (with rounding).
  3. Convert "8.8" to an integer (with rounding).
  4. Convert 8.8 to a string.
  5. Convert 8 to a string.
  6. Convert 8 to a float.
  7. Convert 8 to a boolean.
Answers to exercises

Answer to exercise 1
  1. a is a local variable in the scope of my_function because it is an argument name. b is also a local variable inside my_function , because it is assigned a value inside my_function . c and d are both global variables. It doesn't matter that d is created inside an if block, because the inside of an if block is not a new scope – everything inside the block is part of the same scope as the outside (in this case the global scope). Only function definitions (which start with def ) and class definitions (which start with class ) indicate the start of a new level of scope.
  2. Both a and b will be created every time my_function is called and destroyed when my_function has finished executing. c is created when it is assigned the value 3 , and exists for the remainder of the program's execution. d is created inside the if block (when it is assigned the value which is returned from the function), and also exists for the remainder of the program's execution.
  3. As we will learn in the next chapter, if blocks are executed conditionally . If c were not greater than 2 in this program, the if block would not be executed, and if that were to happen the variable d would never be created.
  4. We may use the variable d later in the code, assuming that it always exists, and have our program crash unexpectedly if it doesn't. It is considered poor coding practice to allow a variable to be defined or undefined depending on the outcome of a conditional statement. It is better to ensure that d is always defined, no matter what – for example, by assigning it some default value at the start. It is much easier and cleaner to check if a variable has the default value than to check whether it exists at all.
Answer to exercise 2
  1. Here is an example program:

    T_f = float(input("Please enter a temperature in °F: "))
    T_c = (5/9) * (T_f - 32)
    print("%g°F = %g°C" % (T_f, T_c))
    

    Note

    The formatting symbol %g is used with floats, and instructs Python to pick a sensible human-readable way to display the float.

  2. The program could crash if the user enters a value which cannot be converted to a floating-point number. We would need to add some kind of error checking to make sure that this doesn't happen – for example, by storing the string value and checking its contents. If we find that the entered value is invalid, we can either print an error message and exit or keep prompting the user for input until valid input is entered.

Answer to exercise 3

Here are example answers:

a_1 = float("8.8")
a_2 = round(8.8)           # round is a built-in function; math has no round
a_3 = round(float("8.8"))  # the string must be converted to a float before rounding
a_4 = "%g" % 8.8
a_5 = "%d" % 8
a_6 = float(8)
a_7 = bool(8)
© Copyright 2013, 2014, University of Cape Town and individual contributors. This work is released under the CC BY-SA 4.0 licence.

[Dec 07, 2017] BitManipulation - Python Wiki

Dec 07, 2017 | wiki.python.org

Here is some information and goals related to Python bit manipulation, binary manipulation.

Some tasks include:

Relevant libraries include:

Some simple code is at ASPN: bit-field manipulation.

Here are some other examples.

Manipulations

To integer.

>>> print int('00100001', 2)
33

To hex string. Note that you don't need to use x8 bits.

>>> print "0x%x" % int('11111111', 2)
0xff
>>> print "0x%x" % int('0110110110', 2)
0x1b6
>>> print "0x%x" % int('0010101110101100111010101101010111110101010101', 2)
0xaeb3ab57d55

To character. 8 bits max.

>>> chr(int('111011', 2))
';'
>>> chr(int('1110110', 2))
'v'
>>> chr(int('11101101', 2))
'\xed'

Characters to integers, but not to strings of 1's and 0's.

>>> int('01110101', 2)
117
>>> chr(int('01110101', 2))
'u'
>>> ord('u')
117

Individual bits.

>>> 1 << 0
1
>>> 1 << 1
2
>>> 1 << 2
4
>>> 1 << 3
8
>>> 1 << 4
16
>>> 1 << 5
32
>>> 1 << 6
64
>>> 1 << 7
128
Transformations Summary

Strings to Integers:

Integers to Strings:

We are still left without a technique for producing binary strings, and deciphering hex strings.
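In current Python both directions are covered by built-ins, so the helper functions shown further below are only needed on very old versions; a quick sketch:

```python
# Integer to binary string
print(bin(182))            # 0b10110110
print(format(182, '08b'))  # 10110110 (zero-padded, no 0b prefix)

# Binary string back to integer
print(int('10110110', 2))  # 182
```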

Hex String to Integer

Use the int type with the base argument:

>>> int('0xff',16)
255
>>> int('d484fa894e',16)
912764078414

Do not use alternatives that utilize eval. eval will execute code passed to it and can thus compromise the security of your program.

Integer to Bin String

Python 3 supports binary literals (e.g. 0b10011000) and has a bin() function. For older versions:

>>> def bin(a):
        s=''
        t={'0':'000','1':'001','2':'010','3':'011',
           '4':'100','5':'101','6':'110','7':'111'}
        for c in oct(a)[1:]:
            s+=t[c]
        return s

or better:

def bin(s):
    return str(s) if s<=1 else bin(s>>1) + str(s&1)
Python Integers

From "The Python Language Reference" page on the Data Model:

"Integers (int) These represent numbers in an unlimited range, subject to available (virtual) memory only. For the purpose of shift and mask operations, a binary representation is assumed, and negative numbers are represented in a variant of 2's complement which gives the illusion of an infinite string of sign bits extending to the left."

Prior to Python 3.1, there was no easy way to determine how Python represented a specific integer internally, i.e. how many bits were used. Python 3.1 adds a bit_length() method to the int type that does exactly that.

Unless you know you are working with numbers that are less than a certain length, for instance numbers from arrays of integers, shifts, rotations, etc. may give unexpected results.
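A quick demonstration of bit_length() (available since Python 3.1):

```python
for n in (0, 1, 2, 255, 256, -7):
    print(n, n.bit_length())
# 0 0, 1 1, 2 2, 255 8, 256 9, -7 3 (the sign is ignored)
```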

The number of the highest bit set is the highest power of 2 less than or equal to the input integer. This is the same as the exponent of the floating point representation of the integer, and is also called its "integer log base 2".(ref.1)

In versions before 3.1, the easiest way to determine the highest bit set is*:

import math

hiBit = math.floor(math.log(int_type, 2))

* There is a long discussion on this topic, and why this method is not good, in "Issue 3439" at Python.org: http://bugs.python.org/issue3439 This discussion led up to the addition of bit_length() in Python 3.1.

An input less than or equal to 0 results in a "ValueError: math domain error".

The section "Finding integer log base 2 of an integer" on the "Bit Twiddling Hacks"(ref.1) web page includes a number of methods for determining this value for integers of known magnitude, presumably when no math coprocessor is available. The only method generally applicable to Python integers of unknown magnitude is the "obvious way" of counting the number of bitwise shift operations needed to reduce the input to 0.

Bit Length Of a Python Integer

bitLen() counts the actual bit length of a Python integer, that is, the number of the highest non-zero bit plus 1. Zero, with no non-zero bit, returns 0. As should be expected from the quote above about "the illusion of an infinite string of sign bits extending to the left," a negative number sends the function into an infinite loop.

The function can return any result up to the length of the largest integer your computer's memory can hold.

def bitLen(int_type):
    length = 0
    while (int_type):
        int_type >>= 1
        length += 1
    return(length)

for i in range(17):
    print(bitLen(i))

# results: 0, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 5

The method using the math module is much faster, especially on huge numbers with hundreds of decimal digits.

bitLenCount()

In common usage, the "bit count" of an integer is the number of set (1) bits, not the bit length of the integer described above. bitLen() can be modified to also provide the count of the number of set bits in the integer. There are faster methods to get the count below.

def bitLenCount(int_type):
    length = 0
    count = 0
    while (int_type):
        count += (int_type & 1)
        length += 1
        int_type >>= 1
    return(length, count)
Operations on Integers of Unknown Magnitude

Some procedures don't need to know the magnitude of an integer to give meaningful results.

bitCount()

The procedure and the information below were found in "Bit Twiddling Hacks"(ref.1)

Counting bits set, Brian Kernighan's way*

unsigned int v;          // count the number of bits set in v
unsigned int c;          // c accumulates the total bits set in v
for (c = 0; v; c++)
{   v &= v - 1;  }       // clear the least significant bit set

This method goes through as many iterations as there are set bits. So if we have a 32-bit word with only the high bit set, then it will only go once through the loop.

* The C Programming Language 2nd Ed., Kernighan & Ritchie, 1988.

Don Knuth pointed out that this method was published by Peter Wegner in CACM 3 (1960), 322. Also discovered independently by Derrick Lehmer and published in 1964 in a book edited by Beckenbach.

Kernighan and Knuth, potent endorsements!

This works because each subtraction "borrows" from the lowest 1-bit. For example:

#       loop pass 1                 loop pass 2
#      101000     101000           100000     100000
#    -      1   & 100111         -      1   & 011111
#    = 100111   = 100000         = 011111   =      0

It is an excellent technique for Python, since the size of the integer need not be determined beforehand.

def bitCount(int_type):
    count = 0
    while(int_type):
        int_type &= int_type - 1
        count += 1
    return(count)
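Since Python 3.10 this population count is also available as the built-in int.bit_count() method; a sketch comparing the two:

```python
def bitCount(int_type):
    count = 0
    while int_type:
        int_type &= int_type - 1  # clear the least significant set bit
        count += 1
    return count

n = 0b101101
print(bitCount(n))  # 4

if hasattr(int, "bit_count"):  # built in since Python 3.10
    print(n.bit_count())       # 4
```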
parityOf()

From "Bit Twiddling Hacks"

Code almost identical to bitCount(), above, calculates the parity of an integer, returning 0 if there are an even number of set bits, and -1 if there are an odd number. In fact, counting the bits and checking whether the result is odd with bitcount & 1 is about the same speed as the parity function.

def parityOf(int_type):
    parity = 0
    while (int_type):
        parity = ~parity
        int_type = int_type & (int_type - 1)
    return(parity)
lowestSet()

To determine the bit number of the lowest bit set in an integer, note that in twos-complement notation i & -i zeroes all but the lowest set bit. The bitLen() procedure above then determines its position. Obviously, negative numbers return the same result as their opposite. In this version, an input of 0 returns -1, in effect an error condition.

For example:

#    00111000     # 56
#    11001000     # twos complement, -56
# &= 00001000

def lowestSet(int_type):
    low = (int_type & -int_type)
    lowBit = -1
    while (low):
        low >>= 1
        lowBit += 1
    return(lowBit)
Single bits

The usual single-bit operations will work on any Python integer. It is up to the programmer to be sure that the value of 'offset' makes sense in the context of the program.

# testBit() returns a nonzero result, 2**offset, if the bit at 'offset' is one.

def testBit(int_type, offset):
    mask = 1 << offset
    return(int_type & mask)

# setBit() returns an integer with the bit at 'offset' set to 1.

def setBit(int_type, offset):
    mask = 1 << offset
    return(int_type | mask)

# clearBit() returns an integer with the bit at 'offset' cleared.

def clearBit(int_type, offset):
    mask = ~(1 << offset)
    return(int_type & mask)

# toggleBit() returns an integer with the bit at 'offset' inverted, 0 -> 1 and 1 -> 0.

def toggleBit(int_type, offset):
    mask = 1 << offset
    return(int_type ^ mask)
Bit fields, e.g. for communication protocols

If you need to interpret individual bits in some data, e.g. a byte stream in a communications protocol, you can use the ctypes module.

import ctypes
c_uint8 = ctypes.c_uint8

class Flags_bits( ctypes.LittleEndianStructure ):
    _fields_ = [
                ("logout",     c_uint8, 1 ),  # asByte & 1
                ("userswitch", c_uint8, 1 ),  # asByte & 2
                ("suspend",    c_uint8, 1 ),  # asByte & 4
                ("idle",       c_uint8, 1 ),  # asByte & 8
               ]

class Flags( ctypes.Union ):
    _anonymous_ = ("bit",)
    _fields_ = [
                ("bit",    Flags_bits ),
                ("asByte", c_uint8    )
               ]

flags = Flags()
flags.asByte = 0x2  # -> 0010

print( "logout: %i"      % flags.bit.logout   )
# `bit` is defined as an anonymous field, so its fields can also be accessed directly:
print( "logout: %i"      % flags.logout     )
print( "userswitch:  %i" % flags.userswitch )
print( "suspend   :  %i" % flags.suspend    )
print( "idle  : %i"      % flags.idle       )
>>>
logout: 0
logout: 0
userswitch:  1
suspend   :  0
idle  : 0
References

ref.1. "Bit Twiddling Hacks" By Sean Eron Anderson

ref.2. "The Art of Assembly Language" by Randall Hyde

ref.3. Hacker's Delight

[Dec 07, 2017] foobarnbaz.com - Understanding Python variables and Memory Management

Dec 07, 2017 | foobarnbaz.com

Understanding Python variables and Memory Management Jul 08, 2012

Have you ever noticed any difference between variables in Python and C? For example, when you do an assignment like the following in C, it actually creates a block of memory space so that it can hold the value for that variable.

int a = 1;

You can think of it as putting the assigned value in a box labeled with the variable name.

And for all the variables you create a new box is created with the variable name to hold the value. If you change the value of the variable the box will be updated with the new value. That means doing

a = 2;

will replace the value in the box with 2.

Assigning one variable to another makes a copy of the value and puts it in the new box.

int b = a;

Now a and b are two separate boxes, each holding the value 2.

But in Python, variables work more like tags, unlike the boxes you have seen before. When you do an assignment in Python, it tags the value with the variable name.

a = 1


and if you change the value of the variable, it just moves the tag onto the new value in memory. You don't need to do the housekeeping job of freeing the memory here; Python's automatic garbage collection does it for you. When a value is left without any names/tags, it is automatically removed from memory.

a = 2


Assigning one variable to another makes a new tag bound to the same value, as shown below.

b = a


Other languages have 'variables'. Python has 'names'.

A bit about Python's memory management

As you have seen before, a value will have only one copy in memory and all the variables having this value will refer to this memory location. For example, when you have variables a, b, c having the value 10, it doesn't mean that there will be three copies of 10 in memory. There will be only one 10, and all the variables a, b, c will point to this value. Once a variable is updated, say by doing a += 1, a new value 11 will be allocated in memory and a will point to it.

Let's check this behaviour with Python Interpreter. Start the Python Shell and try the following for yourselves.

>>> a = 10
>>> b = 10
>>> c = 10
>>> id(a), id(b), id(c)
(140621897573616, 140621897573616, 140621897573616)
>>> a += 1
>>> id(a)
140621897573592

id() returns an object's identity (in CPython, its memory address). As you have noticed, when you assign the same integer value to different variables, we see the same ids. But this does not hold true all the time. See the following for example

>>> x = 500
>>> y = 500
>>> id(x)
4338740848
>>> id(y)
4338741040

What happened here? Even after assigning the same integer value to different variable names, we get two different ids. This is an effect of a CPython optimization: the CPython implementation keeps an array of integer objects for all integers between -5 and 256, so when we create an integer in that range, we simply get back a reference to the pre-existing object. You may refer to the following links for more information.

Stack Overflow: "is" operator behaves unexpectedly with integers

Let's take a look at strings now.

>>> s1 = 'hello'
>>> s2 = 'hello'
>>> id(s1), id(s2)
(4454725888, 4454725888)
>>> s1 == s2
True
>>> s1 is s2
True
>>> s3 = 'hello, world!'
>>> s4 = 'hello, world!'
>>> id(s3), id(s4)
(4454721608, 4454721664)
>>> s3 == s4
True
>>> s3 is s4
False

Looks interesting, doesn't it? When the strings were short and simple, the variable names referred to the same object in memory. But when they became bigger, this was not the case. This is called interning, and Python interns (to some extent) shorter string literals (as in s1 and s2) which are created at compile time. But in general, Python string literals create a new string object each time (as in s3 and s4). Interning is runtime dependent and is always a trade-off between memory use and the cost of checking whether you are creating the same string. There's a built-in intern() function (available as sys.intern() in Python 3) to forcefully apply interning. Read more about interning from the following links.

Stack Overflow: Does Python intern Strings?
Stack Overflow: Python String Interning
Internals of Python String Interning
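To see interning forced explicitly via sys.intern() (the Python 3 home of the intern() built-in), one can build equal strings at runtime, where the compiler cannot fold them into one constant:

```python
import sys

# Two equal strings built at runtime: the compiler cannot merge them
# into a single constant, so they are distinct objects.
a = ''.join(['hello', ', ', 'world!'])
b = ''.join(['hello', ', ', 'world!'])

print(a == b)   # True: same contents
print(a is b)   # False: two separate objects in memory
print(sys.intern(a) is sys.intern(b))  # True: interning yields one shared object
```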

Now we will try to create custom objects and try to find their identities.

>>> class Foo:
...     pass
...
>>> bar = Foo()
>>> baz = Foo()
>>> id(bar)
140730612513248
>>> id(baz)
140730612513320

As you can see, the two instances have different identities; they are two distinct objects in memory. When you create custom objects, they will have unique identities, unless you use the Singleton pattern, which overrides this behaviour (in __new__()) by handing out the same instance on every instantiation.
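A minimal sketch of that Singleton pattern, using nothing beyond the standard __new__ hook:

```python
class Singleton:
    _instance = None

    def __new__(cls):
        # Hand out the one cached instance instead of creating a new one
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

bar = Singleton()
baz = Singleton()
print(id(bar) == id(baz))  # True: both names refer to the same object
```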

Thanks to Jay Pablo (see comments) for correcting the mistakes and making this post a better one.

[Dec 07, 2017] In what structure is a Python object stored in memory - Stack Overflow

Dec 07, 2017 | stackoverflow.com

In what structure is a Python object stored in memory? [duplicate]

Nov 1, 2010 at 4:34

Possible Duplicate:
How do I determine the size of an object in Python?

Say I have a class A:

class A(object):
    def __init__(self, x):
        self.x = x

    def __str__(self):
        return self.x

And I use sys.getsizeof to see how many bytes an instance of A takes:

>>> sys.getsizeof(A(1))
64
>>> sys.getsizeof(A('a'))
64
>>> sys.getsizeof(A('aaa'))
64

As illustrated in the experiment above, the size of an A object is the same no matter what self.x is.

So I wonder: how does Python store an object internally?

Björn Pollex ,Oct 31, 2010 at 10:38

This is certain to differ over python implementations. Which one are you talking about? – Björn Pollex Oct 31 '10 at 10:38

Thomas Wouters ,Oct 31, 2010 at 11:26

It depends on what kind of object, and also which Python implementation :-)

In CPython, which is what most people use when they use python , all Python objects are represented by a C struct, PyObject . Everything that 'stores an object' really stores a PyObject * . The PyObject struct holds the bare minimum information: the object's type (a pointer to another PyObject ) and its reference count (an ssize_t -sized integer.) Types defined in C extend this struct with extra information they need to store in the object itself, and sometimes allocate extra data separately.

For example, tuples (implemented as a PyTupleObject "extending" a PyObject struct) store their length and the PyObject pointers they contain inside the struct itself (the struct contains a 1-length array in the definition, but the implementation allocates a block of memory of the right size to hold the PyTupleObject struct plus exactly as many items as the tuple should hold.) The same way, strings ( PyStringObject ) store their length, their cached hashvalue, some string-caching ("interning") bookkeeping, and the actual char* of their data. Tuples and strings are thus single blocks of memory.

On the other hand, lists ( PyListObject ) store their length, a PyObject ** for their data and another ssize_t to keep track of how much room they allocated for the data. Because Python stores PyObject pointers everywhere, you can't grow a PyObject struct once it's allocated -- doing so may require the struct to move, which would mean finding all pointers and updating them. Because a list may need to grow, it has to allocate the data separately from the PyObject struct. Tuples and strings cannot grow, and so they don't need this. Dicts ( PyDictObject ) work the same way, although they store the key, the value and the cached hashvalue of the key, instead of just the items. Dict also have some extra overhead to accommodate small dicts and specialized lookup functions.
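The list over-allocation described above can be observed from Python with sys.getsizeof(), which reports the PyListObject struct plus its separately tracked pointer array (a sketch; the exact numbers vary across CPython versions and platforms):

```python
import sys

lst = []
prev = sys.getsizeof(lst)
for i in range(20):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != prev:
        # The pointer array grew: CPython reserved room for future appends,
        # so the size jumps in chunks rather than on every append.
        print("len %2d -> %d bytes" % (len(lst), size))
        prev = size
```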

But these are all types in C, and you can usually see how much memory they would use just by looking at the C source. Instances of classes defined in Python rather than C are not so easy. The simplest case, instances of classic classes, is not so difficult: it's a PyObject that stores a PyObject * to its class (which is not the same thing as the type stored in the PyObject struct already), a PyObject * to its __dict__ attribute (which holds all other instance attributes) and a PyObject * to its weakreflist (which is used by the weakref module, and only initialized if necessary.) The instance's __dict__ is usually unique to the instance, so when calculating the "memory size" of such an instance you usually want to count the size of the attribute dict as well. But it doesn't have to be specific to the instance! __dict__ can be assigned to just fine.

New-style classes complicate matters. Unlike with classic classes, instances of new-style classes are not separate C types, so they do not need to store the object's class separately. They do have room for the __dict__ and weakreflist references, but unlike classic instances they don't require the __dict__ attribute for arbitrary attributes. If the class (and all its baseclasses) uses __slots__ to define a strict set of attributes, and none of those attributes is named __dict__ , the instance does not allow arbitrary attributes and no dict is allocated. On the other hand, attributes defined by __slots__ have to be stored somewhere. This is done by storing the PyObject pointers for the values of those attributes directly in the PyObject struct, much like is done with types written in C. Each entry in __slots__ will thus take up a PyObject * , regardless of whether the attribute is set or not.
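The __dict__ versus __slots__ difference is easy to verify from Python (hypothetical class names; exact sizes differ across CPython versions):

```python
class Plain:
    def __init__(self):
        self.x, self.y = 1, 2

class Slotted:
    __slots__ = ('x', 'y')
    def __init__(self):
        self.x, self.y = 1, 2

p, s = Plain(), Slotted()
print(hasattr(p, '__dict__'))  # True: attributes live in a separate dict
print(hasattr(s, '__dict__'))  # False: no dict is allocated at all
try:
    s.z = 3  # no 'z' slot was declared
except AttributeError:
    print("slotted instances reject attributes outside __slots__")
```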

All that said, the problem remains that since everything in Python is an object and everything that holds an object just holds a reference, it's sometimes very difficult to draw the line between objects. Two objects can refer to the same bit of data. They may hold the only two references to that data. Getting rid of both objects also gets rid of the data. Do they both own the data? Does only one of them, but if so, which one? Or would you say they own half the data, even though getting rid of one object doesn't release half the data? Weakrefs can make this even more complicated: two objects can refer to the same data, but deleting one of the objects may cause the other object to also get rid of its reference to that data, causing the data to be cleaned up after all.

Fortunately the common case is fairly easy to figure out. There are memory debuggers for Python that do a reasonable job at keeping track of these things, like heapy . And as long as your class (and its baseclasses) is reasonably simple, you can make an educated guess at how much memory it would take up -- especially in large numbers. If you really want to know the exact sizes of your datastructures, consult the CPython source; most builtin types are simple structs described in Include/<type>object.h and implemented in Objects/<type>object.c . The PyObject struct itself is described in Include/object.h . Just keep in mind: it's pointers all the way down; those take up room too.

satoru ,Oct 31, 2010 at 12:43

Thanks very much. In fact, I'm asking this question because I want to know what's stored in memcached when I invoke cache.set(key, obj) , is it something like a pickled object? – satoru Oct 31 '10 at 12:43

Thomas Wouters ,Oct 31, 2010 at 16:00

Oh, well! That's a completely different question. As I recall (and a quick glance at the source confirms), the memcache module stores pickled versions of the object, yes. It also creates a new pickler for each store, so storing two objects that refer to the same third object means the third object is pickled twice (unless your objects don't pickle that way, of course; you can define pickling exactly how you want.) In other words, the answer to your question is 'len(pickle.dumps(obj))' . – Thomas Wouters Oct 31 '10 at 16:00

tmthydvnprt ,Mar 13, 2016 at 13:31

For the graphically curious, I once tested and plotted this for multiple builtin types: stackoverflow.com/a/30008338/2087463tmthydvnprt Mar 13 '16 at 13:31

in the case of a new class instance, getsizeof() returns the size of a reference to the PyObject which is returned by the C function PyInstance_New()

if you want a list of all the object sizes, check this .

[Dec 05, 2017] python - Problems installing python3 on RHEL - Stack Overflow

Dec 05, 2017 | stackoverflow.com

gecco ,Nov 13, 2011 at 13:53

It is easy to install it manually:
  1. Download (there may be newer releases on Python.org ):
    $ wget https://www.python.org/ftp/python/3.4.3/Python-3.4.3.tar.xz
  2. Unzip
    $ tar xf Python-3.* 
    $ cd Python-3.*
  3. Prepare compilation
    $ ./configure
  4. Build
    $ make
  5. Install
    $ make install

    OR if you don't want to overwrite the python executable (safer, at least on some distros yum needs python to be 2.x, such as for RHEL6) - you can install python3.* as a concurrent instance to the system default with an altinstall :

    $ make altinstall

Now if you want an alternative installation directory, you can pass --prefix to the configure command.

Example: for 'installing' Python in /opt/local, just add --prefix=/opt/local .

After the make install step: in order to use your new Python installation, it could be that you still have to add [prefix]/bin to the $PATH and [prefix]/lib to the $LD_LIBRARY_PATH (depending on the --prefix you passed)

rajadhiraja ,Jul 9, 2012 at 17:58

You used: bzip2 -cd Python-3.2.2.tar.bz2 | tar xvf - This is also a simpler possibility: tar jxvf Python-3.2.2.tar.bz2 – rajadhiraja Jul 9 '12 at 17:58

dannysauer ,Oct 29, 2014 at 21:38

The bzip2 option to tar was -y on some early systems, before bzip2 was "officially" supported, and some systems that don't use GNU tar don't even have bzip2 support built-in (but may have bzip2 binaries). So depending on how portable things need to be, the bunzip2 -c command (or bzip2 -cd ) may be more portable. RHEL6, as in teh question, supports -j , so this is moot for the actual question. But for posterity... – dannysauer Oct 29 '14 at 21:38

Caleb ,Jan 8, 2015 at 20:39

I got a 301 (moved) into a 404 when using the bz2 tar. I changed it to .tgz and it downloaded fine. – Caleb Jan 8 '15 at 20:39

bnu ,Jun 3, 2016 at 13:10

if you get no acceptable C compiler found in $PATH when installing python refer to http://stackoverflow.com/questions/19816275/no-acceptable-c-compiler-found-in-path-when-installing-python – bnu Jun 3 '16 at 13:10

Searene ,Nov 20, 2016 at 3:44

./configure --with-ensurepip=install to enable pip3 , or you won't have pip3 installed after compilation. – Searene Nov 20 '16 at 3:44

Samuel Phan ,Apr 26, 2014 at 23:30

Installing from RPM is generally better, because:

Solution 1: Red Hat & EPEL repositories

Red Hat has added Python 3.4 for CentOS 6 and 7 through the EPEL repository.

Unfortunately:

[EPEL] How to install Python 3.4 on CentOS 6 & 7
sudo yum install -y epel-release
sudo yum install -y python34

# Install pip3
sudo yum install -y python34-setuptools  # install easy_install-3.4
sudo easy_install-3.4 pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3 install virtualenv
sudo pip3 install virtualenvwrapper

If you want to use pyvenv , you can do the following to install pip3 in your virtualenv:

pyvenv --without-pip my_env
curl https://bootstrap.pypa.io/get-pip.py | my_env/bin/python

But if you want to have it out-of-the-box, you can add this bash function (alias) in your .bashrc :

pyvenv() { /usr/bin/pyvenv --without-pip $@; for env in $@; do curl https://bootstrap.pypa.io/get-pip.py | "$env/bin/python"; done; }
Solution 2: IUS Community repositories

The IUS Community provides some up-to-date packages for RHEL & CentOS . The guys behind it are from Rackspace, so I think that they are quite trustworthy...

https://ius.io/

Check the right repo for you here:

https://ius.io/GettingStarted/

[IUS] How to install Python 3.5 on CentOS 6
sudo yum install -y https://centos6.iuscommunity.org/ius-release.rpm
sudo yum install -y python35u python35u-pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3.5 install virtualenv
sudo pip3.5 install virtualenvwrapper

Note: you have pyvenv-3.5 available out-of-the-box if you don't want to use virtualenv .

[IUS] How to install Python 3.5 on CentOS 7
sudo yum install -y https://centos7.iuscommunity.org/ius-release.rpm
sudo yum install -y python35u python35u-pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3.5 install virtualenv
sudo pip3.5 install virtualenvwrapper

Note: you have pyvenv-3.5 available out-of-the-box if you don't want to use virtualenv .

Samuel Phan ,Jul 3, 2015 at 14:54

Fixed the IUS release package URL. they have updated the version, that's all. If they update the package again, you can check the link to their RPM from the webpage. – Samuel Phan Jul 3 '15 at 14:54

Samuel Phan ,Sep 7, 2015 at 9:01

As I said, the link in your answer contains non-printable unicode characters. When I copy/paste your link, here is what I see in VIM: https://dl.iuscommunity.org/pub/ius/stable/CentOS/6/x86_64/i‌​u<200c><200b>s-relea‌​se-1.0-14.iu‌​s.cent‌​os6.noarch.rpm Here is the unicode character: fileformat.info/info/unicode/char/200c/index.htm The URL in my original answer works, I've just tested it. – Samuel Phan Sep 7 '15 at 9:01

Loïc ,Sep 30, 2015 at 13:48

Using this solution, how would you then install pip for python34 ? – Loïc Sep 30 '15 at 13:48

Samuel Phan ,Oct 1, 2015 at 21:11

Very good question, I added a comment for that. It's the best I found. If you want to stick to RPM-based installation, you should use IUS repositories for CentOS 7. They provide a python34u-pip . – Samuel Phan Oct 1 '15 at 21:11

ILMostro_7 ,May 5 at 2:27

easy_install pip3 should work--or a variation of it--to get pip3 installed without needing to curl a specific URL that may or may not be there (anymore). – ILMostro_7 May 5 at 2:27

rsc ,Jul 29, 2012 at 11:15

In addition to gecco's answer I would change step 3 from:
./configure

to:

./configure --prefix=/opt/python3

Then after installation you could also:

# ln -s /opt/python3/bin/python3 /usr/bin/python3

It is to ensure that installation will not conflict with python installed with yum.

See explanation I have found on Internet:

http://www.hosting.com/support/linux/installing-python-3-on-centosredhat-5x-from-source

cababunga ,Feb 12, 2013 at 19:45

Why /opt ? /usr/local specifically exists for this purpose and that's where ./configure with no explicit --prefix will place it. – cababunga Feb 12 '13 at 19:45

rsc ,Feb 13, 2013 at 11:27

@cababunga As I wrote I have been influenced by reading tutorial from specified site. Nevertheless installing python in above way may be usable - it would be a lot easier to uninstall it (it looks like uninstall target for make is not provided). Also you could easily install various versions of python3 in specified separate directories under /opt and manually set which one to use or test. – rsc Feb 13 '13 at 11:27

Caleb ,Jan 8, 2015 at 21:24

You may also want to set up your PATH to contain the binaries folder. For me it was export PATH=$PATH:/opt/python3/binCaleb Jan 8 '15 at 21:24

Paul Draper ,Jan 30, 2014 at 7:52

Use the SCL repos.
sudo sh -c 'wget -qO- http://people.redhat.com/bkabrda/scl_python33.repo >> /etc/yum.repos.d/scl.repo'
sudo yum install python33
scl enable python27

(This last command will have to be run each time you want to use python27 rather than the system default.)

snacks ,Sep 24, 2014 at 13:23

After reading the redhat docs what I needed to do was either; scl enable python33 bash to launch a new shell which will be enabled for python 3 or scl enable python33 'python hello.py' which will run your python file using python 3 in the current shell – snacks Sep 24 '14 at 13:23

Nathan Basanese ,Aug 24, 2015 at 21:46

// , What more generic instructions would also allow the installation of Python 3.4? – Nathan Basanese Aug 24 '15 at 21:46

Florian La Roche ,Feb 3, 2013 at 8:53

You can download a source RPMs and binary RPMs for RHEL6 / CentOS6 from here

This is a backport from the newest Fedora development source rpm to RHEL6 / CentOS6

cababunga ,Feb 12, 2013 at 19:40

That's great. Thanks for your effort, Florian. Maybe running createrepo on those directories would make them even more useful for some people. – cababunga Feb 12 '13 at 19:40

lyomi ,Mar 21, 2014 at 15:18

What a relief. the rpm installed perfectly. – lyomi Mar 21 '14 at 15:18

Nathan Basanese ,Sep 3, 2015 at 20:45

// , How do we make a repository from that link? – Nathan Basanese Sep 3 '15 at 20:45

Nathan Basanese ,Sep 3, 2015 at 21:07

// , I can confirm that this works. Hold on, I just whipped up something quick that used that URL as the baseurl : 0bin.net/paste/Nathan Basanese Sep 3 '15 at 21:07

rkuska ,Jul 16, 2015 at 7:58

Python3 was recently added to EPEL7 as Python34.

There is ongoing (currently) effort to make packaging guidelines about how to package things for Python3 in EPEL7.

See https://bugzilla.redhat.com/show_bug.cgi?id=1219411
and https://lists.fedoraproject.org/pipermail/python-devel/2015-July/000721.html

Nathan Basanese ,Aug 24, 2015 at 21:57

// , What's the hold-up? Pip seems like the simple way to go. – Nathan Basanese Aug 24 '15 at 21:57

Mike Guerette ,Aug 27, 2015 at 13:33

Along with Python 2.7 and 3.3, Red Hat Software Collections now includes Python 3.4 - all work on both RHEL 6 and 7.

RHSCL 2.0 docs are at https://access.redhat.com/documentation/en-US/Red_Hat_Software_Collections/

Plus lot of articles at developerblog.redhat.com.

edit

Follow these instructions to install Python 3.4 on RHEL 6/7 or CentOS 6/7:
# 1. Install the Software Collections tools:
yum install scl-utils

# 2. Download a package with repository for your system.
#  (See the Yum Repositories on external link. For RHEL/CentOS 6:)
wget https://www.softwarecollections.org/en/scls/rhscl/rh-python34/epel-6-x86_64/download/rhscl-rh-python34-epel-6-x86_64.noarch.rpm
#  or for RHEL/CentOS 7
wget https://www.softwarecollections.org/en/scls/rhscl/rh-python34/epel-7-x86_64/download/rhscl-rh-python34-epel-7-x86_64.noarch.rpm

# 3. Install the repo package (on RHEL you will need to enable optional channel first):
yum install rhscl-rh-python34-*.noarch.rpm

# 4. Install the collection:
yum install rh-python34

# 5. Start using software collections:
scl enable rh-python34 bash

Nathan Basanese ,Dec 10, 2015 at 23:53

// , Doesn't this require us to enable a special shell? Combined with virtualenvs, I can see that becoming a pain in the ass. – Nathan Basanese Dec 10 '15 at 23:53

Nathan Basanese ,Dec 10, 2015 at 23:55

// , Why does this require scl enable rh-python34 bash ? What are the implications for using this later on? – Nathan Basanese Dec 10 '15 at 23:55

Searene ,Nov 20, 2016 at 2:53

Is there a way to install python3.5 on RedHat 6? I tried wget https://www.softwarecollections.org/en/scls/rhscl/rh-python3‌​5/epel-6-x86_64/down‌​load/rhscl-rh-python‌​35-epel-6-x86_64.noa‌​rch.rpm , but it was not found. – Searene Nov 20 '16 at 2:53

daneel ,Apr 2, 2015 at 14:12

If you want official RHEL packages you can use RHSCL (Red Hat Software Collections)

More details:

You have to have access to Red Hat Customer Portal to read full articles.

Nathan Basanese ,Aug 24, 2015 at 21:55

// , Just upvoted. Would you be willing to make a summary of what one does to use the RHSCL for this? This is a question and answer site, after all. – Nathan Basanese Aug 24 '15 at 21:55

amphibient ,Feb 8 at 17:12

yum install python34.x86_64 works if you have epel-release installed, which this answer explains how to do, and I confirmed it worked on RHEL 7.3
$ cat /etc/*-release
NAME="Red Hat Enterprise Linux Server"
VERSION="7.3 (Maipo)"

$ type python3
python3 is hashed (/usr/bin/python3)

Aty ,Feb 11 at 20:47

Here are the steps I followed to install Python 3:

yum install wget

wget https://www.python.org/ftp/python/3.6.0/Python-3.6.0.tar.xz

sudo tar xvf Python-3.*

cd Python-3.*

sudo ./configure --prefix=/opt/python3

sudo make

sudo make install

sudo ln -s /opt/python3/bin/python3 /usr/bin/python3

$ /usr/bin/python3

Python 3.6.0

Nagev ,Mar 6 at 18:21

Three steps using Python 3.5 by Software Collections :
sudo yum install centos-release-scl
sudo yum install rh-python35
scl enable rh-python35 bash

Note that sudo is not needed for the last command. Now we can see that python 3 is the default for the current shell:

python --version
Python 3.5.1

Simply skip the last command if you'd rather have Python 2 as the default for the current shell.

Maxime Martineau ,May 10 at 18:02

For RHEL on Amazon Linux, using python3 I had to do :

sudo yum install python34-devel

[Dec 05, 2017] How to Install Latest Python 3.6 Version in Linux

Dec 05, 2017 | www.tecmint.com

Although we can install the core packages and their dependencies using yum and aptitude (or apt-get ), we will explain how to perform the installation from source instead.

Why? The reason is simple: this allows us to have the latest stable release of the language ( 3.6 ) and to provide a distribution-agnostic installation method.

Prior to installing Python in CentOS 7, let's make sure our system has all the necessary development dependencies:

# yum -y groupinstall development
# yum -y install zlib-devel

In Debian we will need to install gcc, make, and the zlib compression / decompression library:

# aptitude -y install gcc make zlib1g-dev

To install Python 3.6 , run the following commands:

# wget https://www.python.org/ftp/python/3.6.3/Python-3.6.3.tar.xz
# tar xJf Python-3.6.3.tar.xz
# cd Python-3.6.3
# ./configure
# make
# make install

[Dec 03, 2017] Perl index function equivalent in Python

Notable quotes:
"... string.find(s, sub[, start[, end]]) Return the lowest index in s where the substring sub is found such that sub is wholly contained in s[start:end]. Return -1 on failure. Defaults for start and end and interpretation of negative values is the same as for slices. ..."
Dec 03, 2017 | stackoverflow.com

Syed Mustafa Zinoor ,Mar 4, 2015 at 15:51

The index() function in Perl returns the location of a substring within a string, searching from an optional start point. Is there something similar in Python? If not, how can it be implemented?

Example : in Perl, I would write an index function to return the index of a string as follows

start = index(input_text,text_to_search,starting_point_of_search)+off_set_length

What should be the equivalent in Python?

Kasramvd ,Mar 4, 2015 at 15:54

In python you can use str.find() to find the index of a sub-string inside a string :
>>> s
'123string 1abcabcstring 2123string 3abc123stringnabc'

>>> s.find('3a')
35

string.find(s, sub[, start[, end]]) Return the lowest index in s where the substring sub is found such that sub is wholly contained in s[start:end]. Return -1 on failure. Defaults for start and end and interpretation of negative values is the same as for slices.
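Putting it together, the Perl idiom index($text, $sub, $start) maps directly onto str.find(); str.index() is the variant that raises ValueError instead of returning -1:

```python
text = "the quick brown fox jumps over the lazy dog"

print(text.find("the"))      # 0: first occurrence
print(text.find("the", 10))  # 31: search starts at offset 10, like Perl's 3-arg index
print(text.find("cat"))      # -1: not found, same convention as Perl's index

# str.index() behaves identically but raises on failure:
try:
    text.index("cat")
except ValueError:
    print("not found")
```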

[Nov 23, 2017] Learning Python, 5th Edition.pdf - Google Drive

Nov 23, 2017 | drive.google.com

Learning Python, Fifth Edition

by Mark Lutz

Copyright © 2013 Mark Lutz. All rights reserved.

Printed in the United States of America.

Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions
are also available for most titles ( http://my.safaribooksonline.com ). For more information, contact our
corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com

Editor: Rachel Roumeliotis Indexer: Lucie Haskins

Production Editor: Christopher Hearse Cover Designer: Randy Comer

Copyeditor: Rachel Monaghan Interior Designer: David Futato

Proofreader: Julie Van Keuren Illustrator: Rebecca Demarest

June 2013: Fifth Edition.

Revision History for the Fifth Edition:

2013-06-07 First release

See http://oreilly.com/catalog/errata.csp?isbn=9781449355739 for release details.

[Nov 19, 2017] Think Python by Allen B. Downey

From Amazon review: "This is a wonderfully written book. Having programmed for several decades, I was surprised by how much I enjoyed an introductory programming book. This book blends in concepts of how to solve problems while introducing Python. The progression of Python was done excellently, with non-trivial, insightful examples."
Nov 19, 2017 | greenteapress.com
  1. Preface
  2. The way of the program
  3. Variables, expressions and statements
  4. Functions
  5. Case study: interface design
  6. Conditionals and recursion
  7. Fruitful functions
  8. Iteration
  9. Strings
  10. Case study: word play
  11. Lists
  12. Dictionaries
  13. Tuples
  14. Case study: data structure selection
  15. Files
  16. Classes and objects
  17. Classes and functions
  18. Classes and methods
  19. Inheritance
  20. The Goodies
  21. Debugging
  22. Analysis of Algorithms
Amazon review:

DrewOJensen, November 29, 2013

Not just for Python beginners

Where was this book when I was taking college programming classes! I have to start off in saying that if you're a beginner in programming, this book is phenomenal. Allen explains the basics very clearly and thoroughly. I'd have to say this book is half about beginner programming and half on Python. As an FYI, this book is good for many basic principles of Python but if you're looking for anything more than just that, I'd recommend Learning Python, 5th Edition by Mark Lutz.

I bought this book for a new job that I took. I minored in CS and wish I would have had this book as my first programming book. I was attracted to it because I needed to learn Python (for work) and all of the guys use the Learning Python for reference. I figured why not start from the beginning and work my way there.

As far as the progression of the book, it moves pretty quickly. You have to stay on your toes with the examples. Having been exposed to a bit of Python before reading, I was able to keep up with the examples just in my head for a little while but as the book moved on, I was doing them in a console. I also think the flow of the book and how Allen moves from topic to topic keeps things cohesive quite well.

Overall, very well executed book and Allen assumes the reader has no experience in programming. Great book!

Update 1/20/14:

After finishing the book I wanted to write a follow up. I have to say that I stand by my initial review and rating! It has been a huge help in getting me up to speed. There are a few specific things that I would like to address.

In regards to the basic principles of Python, this book had done a very good job at balancing what you need to know vs what you can know. It was good to be reminded that this book is a beginner book. I ended up looking up more details and specifics of certain functions and methods mostly because I had specific requirements that I needed to perform with them. This can't be faulted on the author.

As I had mentioned in my first review, if you're looking for more specifics, Learning Python, 5th Edition by Mark Lutz is a great tool. I've borrowed a coworkers copy and will be getting one of my own soon.

I cannot speak on behalf of the database content since I skipped over that section and have no experience doing database/structure.

Otherwise still very good book. I enjoyed being challenged as I read the examples and I like how it wasn't just a "finish what I've shown you" type of examples, but the author said, "Ok, I showed you mostly how to do it this way, and you finished it in another example, now do it a completely different way with what we just discussed."

[Nov 16, 2017] Python coroutines

Nov 16, 2017 | docs.python.org

Coroutines used with asyncio may be implemented using the async def statement, or by using generators . The async def type of coroutine was added in Python 3.5, and is recommended if there is no need to support older Python versions.

Generator-based coroutines should be decorated with @asyncio.coroutine , although this is not strictly enforced. The decorator enables compatibility with async def coroutines, and also serves as documentation. Generator-based coroutines use the yield from syntax introduced in PEP 380 , instead of the original yield syntax.

The word "coroutine", like the word "generator", is used for two different (though related) concepts:

- The function that defines a coroutine (a function definition using async def or decorated with @asyncio.coroutine). If disambiguation is needed, it is called a coroutine function.
- The object obtained by calling a coroutine function. This object represents a computation or an I/O operation (usually a combination) that will complete eventually. If disambiguation is needed, it is called a coroutine object.

Things a coroutine can do:

- result = await future or result = yield from future: suspends the coroutine until the future is done, then returns the future's result.
- result = await coroutine or result = yield from coroutine: waits for another coroutine to produce a result (or raise an exception, which will be propagated).
- return expression: produces a result to the coroutine that is waiting for this one using await or yield from.
- raise exception: raises the exception in the coroutine that is waiting for this one using await or yield from.

Calling a coroutine does not start its code running – the coroutine object returned by the call doesn't do anything until you schedule its execution. There are two basic ways to start it running: call await coroutine or yield from coroutine from another coroutine (assuming the other coroutine is already running!), or schedule its execution using the ensure_future() function or the AbstractEventLoop.create_task() method.

Coroutines (and tasks) can only run when the event loop is running.
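
Both ways of starting a coroutine can be shown in a minimal, runnable sketch (the names greet and main are illustrative, and asyncio.new_event_loop() is used so the example does not depend on a pre-existing loop):

```python
import asyncio

async def greet(name):
    await asyncio.sleep(0)          # yield control to the event loop once
    return "Hello, %s" % name

async def main():
    # Way 1: schedule execution with ensure_future() (background task)
    task = asyncio.ensure_future(greet("task"))
    # Way 2: await the coroutine object directly from another coroutine
    inline = await greet("inline")
    scheduled = await task          # collect the scheduled task's result
    return inline, scheduled

loop = asyncio.new_event_loop()
result = loop.run_until_complete(main())
loop.close()
print(result)                       # ('Hello, inline', 'Hello, task')
```

Note that merely calling greet("task") creates a coroutine object but runs nothing; only awaiting it or wrapping it with ensure_future() schedules its execution.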

@asyncio.coroutine
Decorator to mark generator-based coroutines. This enables the generator to use yield from to call async def coroutines, and also enables the generator to be called by async def coroutines, for instance using an await expression.

There is no need to decorate async def coroutines themselves.

If the generator is not yielded from before it is destroyed, an error message is logged. See Detect coroutines never scheduled .

Note

In this documentation, some methods are documented as coroutines, even if they are plain Python functions returning a Future. This is intentional, to leave freedom to tweak the implementation of these functions in the future. If such a function needs to be used in callback-style code, wrap its result with ensure_future().

18.5.3.1.1. Example: Hello World coroutine

Example of a coroutine displaying "Hello World":

import asyncio

async def hello_world():
    print("Hello World!")

loop = asyncio.get_event_loop()
# Blocking call which returns when the hello_world() coroutine is done
loop.run_until_complete(hello_world())
loop.close()

See also

The Hello World with call_soon() example uses the AbstractEventLoop.call_soon() method to schedule a callback.

18.5.3.1.2. Example: Coroutine displaying the current date

Example of a coroutine displaying the current date every second for 5 seconds, using the sleep() function:

import asyncio
import datetime

async def display_date(loop):
    end_time = loop.time() + 5.0
    while True:
        print(datetime.datetime.now())
        if (loop.time() + 1.0) >= end_time:
            break
        await asyncio.sleep(1)

loop = asyncio.get_event_loop()
# Blocking call which returns when the display_date() coroutine is done
loop.run_until_complete(display_date(loop))
loop.close()

See also

The display the current date with call_later() example uses a callback with the AbstractEventLoop.call_later() method.

18.5.3.1.3. Example: Chain coroutines

Example chaining coroutines:

import asyncio

async def compute(x, y):
    print("Compute %s + %s ..." % (x, y))
    await asyncio.sleep(1.0)
    return x + y

async def print_sum(x, y):
    result = await compute(x, y)
    print("%s + %s = %s" % (x, y, result))

loop = asyncio.get_event_loop()
loop.run_until_complete(print_sum(1, 2))
loop.close()

compute() is chained to print_sum() : print_sum() coroutine waits until compute() is completed before returning its result.

Sequence diagram of the example:

[Image: tulip_coro.png]

The "Task" is created by the AbstractEventLoop.run_until_complete() method when it gets a coroutine object instead of a task.

The diagram shows the control flow; it does not describe exactly how things work internally. For example, the sleep coroutine creates an internal future which uses AbstractEventLoop.call_later() to wake up the task in 1 second.

18.5.3.2. InvalidStateError

exception asyncio.InvalidStateError
The operation is not allowed in this state.

18.5.3.3. TimeoutError

exception asyncio.TimeoutError
The operation exceeded the given deadline.

Note

This exception is different from the builtin TimeoutError exception!

18.5.3.4. Future

class asyncio.Future(*, loop=None)
This class is almost compatible with concurrent.futures.Future .

Differences:

This class is not thread safe .

cancel()
Cancel the future and schedule callbacks.

If the future is already done or cancelled, return False . Otherwise, change the future's state to cancelled, schedule the callbacks and return True .

cancelled()
Return True if the future was cancelled.

done()
Return True if the future is done.

Done means either that a result / exception are available, or that the future was cancelled.

result()
Return the result this future represents.

If the future has been cancelled, raises CancelledError . If the future's result isn't yet available, raises InvalidStateError . If the future is done and has an exception set, this exception is raised.

exception()
Return the exception that was set on this future.

The exception (or None if no exception was set) is returned only if the future is done. If the future has been cancelled, raises CancelledError . If the future isn't done yet, raises InvalidStateError .

add_done_callback(fn)
Add a callback to be run when the future becomes done.

The callback is called with a single argument - the future object. If the future is already done when this is called, the callback is scheduled with call_soon() .

Use functools.partial to pass parameters to the callback. For example, fut.add_done_callback(functools.partial(print, "Future:", flush=True)) will call print("Future:", fut, flush=True).

remove_done_callback(fn)
Remove all instances of a callback from the "call when done" list.

Returns the number of callbacks removed.

set_result(result)
Mark the future done and set its result.

If the future is already done when this method is called, raises InvalidStateError .

set_exception(exception)
Mark the future done and set an exception.

If the future is already done when this method is called, raises InvalidStateError .
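
The state transitions described above can be observed directly. A small sketch (the RuntimeError payload is purely illustrative; a fresh loop is created so the snippet is self-contained):

```python
import asyncio

loop = asyncio.new_event_loop()
fut = loop.create_future()                   # a pending Future tied to this loop

assert not fut.done() and not fut.cancelled()
fut.set_exception(RuntimeError('boom'))      # marks the future done
assert fut.done()
assert isinstance(fut.exception(), RuntimeError)

try:
    fut.set_result(42)                       # already done: not allowed
except asyncio.InvalidStateError:
    print('InvalidStateError, as documented')

loop.close()
```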

18.5.3.4.1. Example: Future with run_until_complete()

Example combining a Future and a coroutine function :

import asyncio

async def slow_operation(future):
    await asyncio.sleep(1)
    future.set_result('Future is done!')

loop = asyncio.get_event_loop()
future = asyncio.Future()
asyncio.ensure_future(slow_operation(future))
loop.run_until_complete(future)
print(future.result())
loop.close()

The coroutine function is responsible for the computation (which takes 1 second) and it stores the result into the future. The run_until_complete() method waits for the completion of the future.

Note

The run_until_complete() method internally uses the add_done_callback() method to be notified when the future is done.

18.5.3.4.2. Example: Future with run_forever()

The previous example can be written differently using the Future.add_done_callback() method to describe explicitly the control flow:

import asyncio

async def slow_operation(future):
    await asyncio.sleep(1)
    future.set_result('Future is done!')

def got_result(future):
    print(future.result())
    loop.stop()

loop = asyncio.get_event_loop()
future = asyncio.Future()
asyncio.ensure_future(slow_operation(future))
future.add_done_callback(got_result)
try:
    loop.run_forever()
finally:
    loop.close()

In this example, the future is used to link slow_operation() to got_result(): when slow_operation() is done, got_result() is called with the result.

18.5.3.5. Task

class asyncio.Task(coro, *, loop=None)
Schedule the execution of a coroutine : wrap it in a future. A task is a subclass of Future .

A task is responsible for executing a coroutine object in an event loop. If the wrapped coroutine yields from a future, the task suspends the execution of the wrapped coroutine and waits for the completion of the future. When the future is done, the execution of the wrapped coroutine restarts with the result or the exception of the future.

Event loops use cooperative scheduling: an event loop only runs one task at a time. Other tasks may run in parallel if other event loops are running in different threads. While a task waits for the completion of a future, the event loop executes a new task.

The cancellation of a task is different from the cancellation of a future. Calling cancel() will throw a CancelledError to the wrapped coroutine. cancelled() only returns True if the wrapped coroutine did not catch the CancelledError exception, or raised a CancelledError exception.

If a pending task is destroyed, the execution of its wrapped coroutine did not complete. It is probably a bug and a warning is logged: see Pending task destroyed .

Don't directly create Task instances: use the ensure_future() function or the AbstractEventLoop.create_task() method.

This class is not thread safe .

classmethod all_tasks(loop=None)
Return a set of all tasks for an event loop.

By default all tasks for the current event loop are returned.

classmethod current_task(loop=None)
Return the currently running task in an event loop or None .

By default the current task for the current event loop is returned.

None is returned when called not in the context of a Task .

cancel()
Request that this task cancel itself.

This arranges for a CancelledError to be thrown into the wrapped coroutine on the next cycle through the event loop. The coroutine then has a chance to clean up or even deny the request using try/except/finally.

Unlike Future.cancel() , this does not guarantee that the task will be cancelled: the exception might be caught and acted upon, delaying cancellation of the task or preventing cancellation completely. The task may also return a value or raise a different exception.

Immediately after this method is called, cancelled() will not return True (unless the task was already cancelled). A task will be marked as cancelled when the wrapped coroutine terminates with a CancelledError exception (even if cancel() was not called).
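
A short sketch of this behavior, with an illustrative worker coroutine that catches CancelledError to clean up and then re-raises it so the task still ends up cancelled:

```python
import asyncio

async def worker():
    try:
        await asyncio.sleep(10)
    except asyncio.CancelledError:
        print('worker: cleaning up')
        raise                          # re-raise so the task ends up cancelled

async def main():
    task = asyncio.ensure_future(worker())
    await asyncio.sleep(0)             # let the worker start and suspend
    task.cancel()                      # request cancellation...
    assert not task.cancelled()        # ...but it has not happened yet
    try:
        await task
    except asyncio.CancelledError:
        pass
    assert task.cancelled()            # coroutine terminated with CancelledError
    return task

loop = asyncio.new_event_loop()
task = loop.run_until_complete(main())
loop.close()
```

Had worker() swallowed the CancelledError and returned normally, cancelled() would have stayed False, exactly as the paragraph above describes.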

get_stack(*, limit=None)
Return the list of stack frames for this task's coroutine.

If the coroutine is not done, this returns the stack where it is suspended. If the coroutine has completed successfully or was cancelled, this returns an empty list. If the coroutine was terminated by an exception, this returns the list of traceback frames.

The frames are always ordered from oldest to newest.

The optional limit gives the maximum number of frames to return; by default all available frames are returned. Its meaning differs depending on whether a stack or a traceback is returned: the newest frames of a stack are returned, but the oldest frames of a traceback are returned. (This matches the behavior of the traceback module.)

For reasons beyond our control, only one stack frame is returned for a suspended coroutine.

print_stack(*, limit=None, file=None)
Print the stack or traceback for this task's coroutine.

This produces output similar to that of the traceback module, for the frames retrieved by get_stack(). The limit argument is passed to get_stack(). The file argument is an I/O stream to which the output is written; by default output is written to sys.stderr.

18.5.3.5.1. Example: Parallel execution of tasks

Example executing 3 tasks (A, B, C) in parallel:

import asyncio

async def factorial(name, number):
    f = 1
    for i in range(2, number+1):
        print("Task %s: Compute factorial(%s)..." % (name, i))
        await asyncio.sleep(1)
        f *= i
    print("Task %s: factorial(%s) = %s" % (name, number, f))

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.gather(
    factorial("A", 2),
    factorial("B", 3),
    factorial("C", 4),
))
loop.close()

Output:

Task A: Compute factorial(2)...
Task B: Compute factorial(2)...
Task C: Compute factorial(2)...
Task A: factorial(2) = 2
Task B: Compute factorial(3)...
Task C: Compute factorial(3)...
Task B: factorial(3) = 6
Task C: Compute factorial(4)...
Task C: factorial(4) = 24

A task is automatically scheduled for execution when it is created. The event loop stops when all tasks are done.

18.5.3.6. Task functions

Note

In the functions below, the optional loop argument allows explicitly setting the event loop object used by the underlying task or coroutine. If it's not provided, the default event loop is used.

asyncio.as_completed(fs, *, loop=None, timeout=None)
Return an iterator whose values, when waited for, are Future instances.

Raises asyncio.TimeoutError if the timeout occurs before all Futures are done.

Example:

for f in as_completed(fs):
    result = yield from f  # The 'yield from' may raise
    # Use result

Note

The futures are not necessarily members of fs.

asyncio.ensure_future(coro_or_future, *, loop=None)
Schedule the execution of a coroutine object : wrap it in a future. Return a Task object.

If the argument is a Future , it is returned directly.

New in version 3.4.4. Changed in version 3.5.1: The function accepts any awaitable object.

See also

The AbstractEventLoop.create_task() method.

asyncio.async(coro_or_future, *, loop=None)
A deprecated alias to ensure_future(). Deprecated since version 3.4.4.

asyncio.wrap_future(future, *, loop=None)
Wrap a concurrent.futures.Future object in a Future object.

asyncio.gather(*coros_or_futures, loop=None, return_exceptions=False)
Return a future aggregating results from the given coroutine objects or futures.

All futures must share the same event loop. If all the tasks are done successfully, the returned future's result is the list of results (in the order of the original sequence, not necessarily the order of results arrival). If return_exceptions is true, exceptions in the tasks are treated the same as successful results, and gathered in the result list; otherwise, the first raised exception will be immediately propagated to the returned future.

Cancellation: if the outer Future is cancelled, all children (that have not completed yet) are also cancelled. If any child is cancelled, this is treated as if it raised CancelledError – the outer Future is not cancelled in this case. (This is to prevent the cancellation of one child to cause other children to be cancelled.)
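
A small illustration of return_exceptions=True (the ok and boom coroutines are made up for the example): the exception appears in the result list, in the original order, instead of propagating.

```python
import asyncio

async def ok(x):
    await asyncio.sleep(0)
    return x

async def boom():
    await asyncio.sleep(0)
    raise ValueError('bad input')

async def main():
    # With return_exceptions=True the raised ValueError is gathered
    # into the result list rather than propagated to the caller.
    return await asyncio.gather(ok(1), boom(), ok(3),
                                return_exceptions=True)

loop = asyncio.new_event_loop()
results = loop.run_until_complete(main())
loop.close()
print(results)        # [1, ValueError('bad input'), 3]
```

With the default return_exceptions=False, the same await would instead raise ValueError immediately.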

asyncio.iscoroutine(obj)
Return True if obj is a coroutine object, which may be based on a generator or an async def coroutine.

asyncio.iscoroutinefunction(func)
Return True if func is determined to be a coroutine function, which may be a decorated generator function or an async def function.

asyncio.run_coroutine_threadsafe(coro, loop)
Submit a coroutine object to a given event loop.

Return a concurrent.futures.Future to access the result.

This function is meant to be called from a different thread than the one where the event loop is running. Usage:

# Create a coroutine
coro = asyncio.sleep(1, result=3)
# Submit the coroutine to a given loop
future = asyncio.run_coroutine_threadsafe(coro, loop)
# Wait for the result with an optional timeout argument
assert future.result(timeout) == 3

If an exception is raised in the coroutine, the returned future will be notified. It can also be used to cancel the task in the event loop:

try:
    result = future.result(timeout)
except asyncio.TimeoutError:
    print('The coroutine took too long, cancelling the task...')
    future.cancel()
except Exception as exc:
    print('The coroutine raised an exception: {!r}'.format(exc))
else:
    print('The coroutine returned: {!r}'.format(result))

See the concurrency and multithreading section of the documentation.

Note

Unlike other functions from the module, run_coroutine_threadsafe() requires the loop argument to be passed explicitly.

New in version 3.5.1.

coroutine asyncio.sleep(delay, result=None, *, loop=None)
Create a coroutine that completes after a given time (in seconds). If result is provided, it is produced to the caller when the coroutine completes.

The resolution of the sleep depends on the granularity of the event loop .

This function is a coroutine .

asyncio.shield(arg, *, loop=None)
Wait for a future, shielding it from cancellation.

The statement:

res = yield from shield(something())

is exactly equivalent to the statement:

res = yield from something()

except that if the coroutine containing it is cancelled, the task running in something() is not cancelled. From the point of view of something() , the cancellation did not happen. But its caller is still cancelled, so the yield-from expression still raises CancelledError . Note: If something() is cancelled by other means this will still cancel shield() .

If you want to completely ignore cancellation (not recommended) you can combine shield() with a try/except clause, as follows:

try:
    res = yield from shield(something())
except CancelledError:
    res = None
coroutine asyncio.wait(futures, *, loop=None, timeout=None, return_when=ALL_COMPLETED)
Wait for the Futures and coroutine objects given by the sequence futures to complete. Coroutines will be wrapped in Tasks. Returns two sets of Future : (done, pending).

The sequence futures must not be empty.

timeout can be used to control the maximum number of seconds to wait before returning. timeout can be an int or float. If timeout is not specified or None , there is no limit to the wait time.

return_when indicates when this function should return. It must be one of the following constants of the concurrent.futures module:

FIRST_COMPLETED: The function will return when any future finishes or is cancelled.
FIRST_EXCEPTION: The function will return when any future finishes by raising an exception. If no future raises an exception then it is equivalent to ALL_COMPLETED.
ALL_COMPLETED: The function will return when all futures finish or are cancelled.

This function is a coroutine .

Usage:

done, pending = yield from asyncio.wait(fs)

Note

This does not raise asyncio.TimeoutError ! Futures that aren't done when the timeout occurs are returned in the second set.
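
A sketch of that timeout behavior: the unfinished future comes back in the pending set rather than raising, and can then be cancelled explicitly (the 0.01 s / 60 s delays are arbitrary illustrations):

```python
import asyncio

async def main():
    fast = asyncio.ensure_future(asyncio.sleep(0.01, result='fast'))
    slow = asyncio.ensure_future(asyncio.sleep(60, result='slow'))
    # wait() does not raise asyncio.TimeoutError: whatever has not
    # finished by the timeout is simply returned in the second set.
    done, pending = await asyncio.wait({fast, slow}, timeout=0.5)
    for task in pending:
        task.cancel()
    # let the cancellations be delivered before the loop stops
    await asyncio.gather(*pending, return_exceptions=True)
    return done, pending

loop = asyncio.new_event_loop()
done, pending = loop.run_until_complete(main())
loop.close()
print(len(done), len(pending))   # 1 1
```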

coroutine asyncio.wait_for(fut, timeout, *, loop=None)
Wait for the single Future or coroutine object to complete with timeout. If timeout is None , block until the future completes.

Coroutine will be wrapped in Task .

Returns result of the Future or coroutine. When a timeout occurs, it cancels the task and raises asyncio.TimeoutError . To avoid the task cancellation, wrap it in shield() .

If the wait is cancelled, the future fut is also cancelled.

This function is a coroutine , usage:

result = yield from asyncio.wait_for(fut, 60.0)
Changed in version 3.4.3: If the wait is cancelled, the future fut is now also cancelled.

[Nov 16, 2017] Effective Python Item 40 Consider Coroutines to Run Many Functions Concurrently

Nov 16, 2017 | www.informit.com

Threads give Python programmers a way to run multiple functions seemingly at the same time (see Item 37: "Use Threads for Blocking I/O, Avoid for Parallelism"). But there are three big problems with threads:

- They require special tools (such as Lock) to coordinate safely, which makes threaded code harder to reason about, extend, and maintain than single-threaded procedural code.
- Threads require a lot of memory, about 8 MB per executing thread, which is prohibitive when you want thousands of concurrent activities.
- Threads are costly to start; constantly creating and tearing them down slows down the whole program.

Python can work around all these issues with coroutines . Coroutines let you have many seemingly simultaneous functions in your Python programs. They're implemented as an extension to generators. The cost of starting a generator coroutine is a function call. Once active, they each use less than 1 KB of memory until they're exhausted.

Coroutines work by enabling the code consuming a generator to send a value back into the generator function after each yield expression. The generator function receives the value passed to the send function as the result of the corresponding yield expression.

def my_coroutine():
    while True:
        received = yield
        print('Received:', received)

it = my_coroutine()
next(it)             # Prime the coroutine
it.send('First')
it.send('Second')

>>>
Received: First
Received: Second

The initial call to next is required to prepare the generator for receiving the first send by advancing it to the first yield expression. Together, yield and send provide generators with a standard way to vary their next yielded value in response to external input.

For example, say you want to implement a generator coroutine that yields the minimum value it's been sent so far. Here, the bare yield prepares the coroutine with the initial minimum value sent in from the outside. Then the generator repeatedly yields the new minimum in exchange for the next value to consider.

def minimize():
    current = yield
    while True:
        value = yield current
        current = min(value, current)

The code consuming the generator can run one step at a time and will output the minimum value seen after each input.

it = minimize()
next(it)            # Prime the generator
print(it.send(10))
print(it.send(4))
print(it.send(22))
print(it.send(-1))

>>>
10
4
4
-1

The generator function will seemingly run forever, making forward progress with each new call to send . Like threads, coroutines are independent functions that can consume inputs from their environment and produce resulting outputs. The difference is that coroutines pause at each yield expression in the generator function and resume after each call to send from the outside. This is the magical mechanism of coroutines.

This behavior allows the code consuming the generator to take action after each yield expression in the coroutine. The consuming code can use the generator's output values to call other functions and update data structures. Most importantly, it can advance other generator functions until their next yield expressions. By advancing many separate generators in lockstep, they will all seem to be running simultaneously, mimicking the concurrent behavior of Python threads.
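
A toy illustration of this lockstep advancement, with two made-up counter coroutines driven by one consumer loop: each send() resumes one generator up to its next yield, so both appear to run "at once".

```python
def counter(start):
    # A trivial coroutine: repeatedly yields its current value and
    # accepts an increment via send().
    value = start
    while True:
        step = yield value
        value += step

a, b = counter(0), counter(100)
next(a), next(b)            # prime both to their first yield
trace = []
for _ in range(3):
    # Advance the two independent coroutines in lockstep.
    trace.append((a.send(1), b.send(10)))
print(trace)                # [(1, 110), (2, 120), (3, 130)]
```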

The Game of Life

Let me demonstrate the simultaneous behavior of coroutines with an example. Say you want to use coroutines to implement Conway's Game of Life. The rules of the game are simple. You have a two-dimensional grid of an arbitrary size. Each cell in the grid can either be alive or empty.

ALIVE = '*'
EMPTY = '-'

The game progresses one tick of the clock at a time. At each tick, each cell counts how many of its neighboring eight cells are still alive. Based on its neighbor count, each cell decides if it will keep living, die, or regenerate. Here's an example of a 5×5 Game of Life grid after four generations with time going to the right. I'll explain the specific rules further below.

  0   |   1   |   2   |   3   |   4
----- | ----- | ----- | ----- | -----
-*--- | --*-- | --**- | --*-- | -----
--**- | --**- | -*--- | -*--- | -**--
---*- | --**- | --**- | --*-- | -----
----- | ----- | ----- | ----- | -----

I can model this game by representing each cell as a generator coroutine running in lockstep with all the others.

To implement this, first I need a way to retrieve the status of neighboring cells. I can do this with a coroutine named count_neighbors that works by yielding Query objects. The Query class I define myself. Its purpose is to provide the generator coroutine with a way to ask its surrounding environment for information.

Query = namedtuple('Query', ('y', 'x'))

The coroutine yields a Query for each neighbor. The result of each yield expression will be the value ALIVE or EMPTY . That's the interface contract I've defined between the coroutine and its consuming code. The count_neighbors generator sees the neighbors' states and returns the count of living neighbors.

def count_neighbors(y, x):
    n_ = yield Query(y + 1, x + 0)  # North
    ne = yield Query(y + 1, x + 1)  # Northeast
    # Define e_, se, s_, sw, w_, nw ...
    # ...
    neighbor_states = [n_, ne, e_, se, s_, sw, w_, nw]
    count = 0
    for state in neighbor_states:
        if state == ALIVE:
            count += 1
    return count

I can drive the count_neighbors coroutine with fake data to test it. Here, I show how Query objects will be yielded for each neighbor. count_neighbors expects to receive cell states corresponding to each Query through the coroutine's send method. The final count is returned in the StopIteration exception that is raised when the generator is exhausted by the return statement.

it = count_neighbors(10, 5)
q1 = next(it)                  # Get the first query
print('First yield: ', q1)
q2 = it.send(ALIVE)            # Send q1 state, get q2
print('Second yield:', q2)
q3 = it.send(ALIVE)            # Send q2 state, get q3
# ...
try:
    count = it.send(EMPTY)     # Send q8 state, retrieve count
except StopIteration as e:
    print('Count: ', e.value)  # Value from return statement
>>>
First yield:  Query(y=11, x=5)
Second yield: Query(y=11, x=6)
...
Count:  2

Now I need the ability to indicate that a cell will transition to a new state in response to the neighbor count that it found from count_neighbors . To do this, I define another coroutine called step_cell . This generator will indicate transitions in a cell's state by yielding Transition objects. This is another class that I define, just like the Query class.

Transition = namedtuple('Transition', ('y', 'x', 'state'))

The step_cell coroutine receives its coordinates in the grid as arguments. It yields a Query to get the initial state of those coordinates. It runs count_neighbors to inspect the cells around it. It runs the game logic to determine what state the cell should have for the next clock tick. Finally, it yields a Transition object to tell the environment the cell's next state.

def game_logic(state, neighbors):
    # ...

def step_cell(y, x):
    state = yield Query(y, x)
    neighbors = yield from count_neighbors(y, x)
    next_state = game_logic(state, neighbors)
    yield Transition(y, x, next_state)

Importantly, the call to count_neighbors uses the yield from expression. This expression allows Python to compose generator coroutines together, making it easy to reuse smaller pieces of functionality and build complex coroutines from simpler ones. When count_neighbors is exhausted, the final value it returns (with the return statement) will be passed to step_cell as the result of the yield from expression.
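
The value-returning behavior of yield from can be seen in isolation with a pair of throwaway generators: the outer generator transparently re-yields the inner one's values, and the inner return statement becomes the value of the yield from expression.

```python
def inner():
    first = yield 'a'
    second = yield 'b'
    return first + second        # becomes the value of `yield from`

def outer():
    total = yield from inner()   # re-yields 'a' and 'b' transparently
    yield total                  # then exposes inner()'s return value

it = outer()
assert next(it) == 'a'           # yielded by inner(), through outer()
assert it.send(1) == 'b'         # send() passes through to inner()
assert it.send(2) == 3           # inner() returned 1 + 2
```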

Now, I can finally define the simple game logic for Conway's Game of Life. There are only three rules.

def game_logic(state, neighbors):
    if state == ALIVE:
        if neighbors < 2:
            return EMPTY     # Die: Too few
        elif neighbors > 3:
            return EMPTY     # Die: Too many
    else:
        if neighbors == 3:
            return ALIVE     # Regenerate
    return state
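
Repeating the definitions so the snippet runs standalone, the three rules can be checked with a few assertions:

```python
ALIVE = '*'
EMPTY = '-'

def game_logic(state, neighbors):
    if state == ALIVE:
        if neighbors < 2:
            return EMPTY     # Die: Too few
        elif neighbors > 3:
            return EMPTY     # Die: Too many
    else:
        if neighbors == 3:
            return ALIVE     # Regenerate
    return state

assert game_logic(ALIVE, 1) == EMPTY   # underpopulation
assert game_logic(ALIVE, 2) == ALIVE   # survival
assert game_logic(ALIVE, 4) == EMPTY   # overpopulation
assert game_logic(EMPTY, 3) == ALIVE   # regeneration
assert game_logic(EMPTY, 2) == EMPTY   # empty cell stays empty
```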

I can drive the step_cell coroutine with fake data to test it.

it = step_cell(10, 5)
q0 = next(it)           # Initial location query
print('Me:      ', q0)
q1 = it.send(ALIVE)     # Send my status, get neighbor query
print('Q1:      ', q1)
# ...
t1 = it.send(EMPTY)     # Send for q8, get game decision
print('Outcome: ', t1)

>>>
Me:       Query(y=10, x=5)
Q1:       Query(y=11, x=5)
...
Outcome:  Transition(y=10, x=5, state='-')

The goal of the game is to run this logic for a whole grid of cells in lockstep. To do this, I can further compose the step_cell coroutine into a simulate coroutine. This coroutine progresses the grid of cells forward by yielding from step_cell many times. After progressing every coordinate, it yields a TICK object to indicate that the current generation of cells have all transitioned.

TICK = object()

def simulate(height, width):
    while True:
        for y in range(height):
            for x in range(width):
                yield from step_cell(y, x)
        yield TICK

What's impressive about simulate is that it's completely disconnected from the surrounding environment. I still haven't defined how the grid is represented in Python objects, how Query , Transition , and TICK values are handled on the outside, nor how the game gets its initial state. But the logic is clear. Each cell will transition by running step_cell . Then the game clock will tick. This will continue forever, as long as the simulate coroutine is advanced.

This is the beauty of coroutines. They help you focus on the logic of what you're trying to accomplish. They decouple your code's instructions for the environment from the implementation that carries out your wishes. This enables you to run coroutines seemingly in parallel. This also allows you to improve the implementation of following those instructions over time without changing the coroutines.

Now, I want to run simulate in a real environment. To do that, I need to represent the state of each cell in the grid. Here, I define a class to contain the grid:

class Grid(object):
    def __init__(self, height, width):
        self.height = height
        self.width = width
        self.rows = []
        for _ in range(self.height):
            self.rows.append([EMPTY] * self.width)

    def __str__(self):
        # ...

The grid allows you to get and set the value of any coordinate. Coordinates that are out of bounds will wrap around, making the grid act like infinite looping space.

    def query(self, y, x):
        return self.rows[y % self.height][x % self.width]

    def assign(self, y, x, state):
        self.rows[y % self.height][x % self.width] = state
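
A condensed, standalone version of the class (omitting the elided __str__) shows the wrap-around behavior:

```python
EMPTY, ALIVE = '-', '*'

class Grid(object):
    def __init__(self, height, width):
        self.height = height
        self.width = width
        self.rows = [[EMPTY] * width for _ in range(height)]

    def query(self, y, x):
        # Modulo arithmetic makes out-of-range coordinates wrap around
        return self.rows[y % self.height][x % self.width]

    def assign(self, y, x, state):
        self.rows[y % self.height][x % self.width] = state

grid = Grid(3, 3)
grid.assign(0, 0, ALIVE)
assert grid.query(3, 3) == ALIVE      # (3, 3) wraps back to (0, 0)
assert grid.query(-3, -3) == ALIVE    # negative coordinates wrap too
```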

At last, I can define the function that interprets the values yielded from simulate and all of its interior coroutines. This function turns the instructions from the coroutines into interactions with the surrounding environment. It progresses the whole grid of cells forward a single step and then returns a new grid containing the next state.

def live_a_generation(grid, sim):
    progeny = Grid(grid.height, grid.width)
    item = next(sim)
    while item is not TICK:
        if isinstance(item, Query):
            state = grid.query(item.y, item.x)
            item = sim.send(state)
        else:  # Must be a Transition
            progeny.assign(item.y, item.x, item.state)
            item = next(sim)
    return progeny

To see this function in action, I need to create a grid and set its initial state. Here, I make a classic shape called a glider.

grid = Grid(5, 9)
grid.assign(0, 3, ALIVE)
# ...
print(grid)

>>>
---*-----
----*----
--***----
---------
---------

Now I can progress this grid forward one generation at a time. You can see how the glider moves down and to the right on the grid based on the simple rules from the game_logic function.

class ColumnPrinter(object):
    # ...

columns = ColumnPrinter()
sim = simulate(grid.height, grid.width)
for i in range(5):
    columns.append(str(grid))
    grid = live_a_generation(grid, sim)

print(columns)

>>>
    0     |     1     |     2     |     3     |     4
---*----- | --------- | --------- | --------- | ---------
----*---- | --*-*---- | ----*---- | ---*----- | ----*----
--***---- | ---**---- | --*-*---- | ----**--- | -----*---
--------- | ---*----- | ---**---- | ---**---- | ---***---
--------- | --------- | --------- | --------- | ---------

The best part about this approach is that I can change the game_logic function without having to update the code that surrounds it. I can change the rules or add larger spheres of influence with the existing mechanics of Query , Transition , and TICK . This demonstrates how coroutines enable the separation of concerns, which is an important design principle.

Coroutines in Python 2

Unfortunately, Python 2 is missing some of the syntactical sugar that makes coroutines so elegant in Python 3. There are two limitations. First, there is no yield from expression. That means that when you want to compose generator coroutines in Python 2, you need to include an additional loop at the delegation point.

# Python 2
def delegated():
    yield 1
    yield 2

def composed():
    yield 'A'
    for value in delegated():  # yield from in Python 3
        yield value
    yield 'B'

print list(composed())

>>>
['A', 1, 2, 'B']

The second limitation is that there is no support for the return statement in Python 2 generators. To get the same behavior that interacts correctly with try / except / finally blocks, you need to define your own exception type and raise it when you want to return a value.

# Python 2
class MyReturn(Exception):
    def __init__(self, value):
        self.value = value

def delegated():
    yield 1
    raise MyReturn(2)  # return 2 in Python 3
    yield 'Not reached'

def composed():
    try:
        for value in delegated():
            yield value
    except MyReturn as e:
        output = e.value
    yield output * 4

print list(composed())

>>>
[1, 8]
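In Python 3 the same behavior falls out naturally: return inside a generator attaches the value to StopIteration, and yield from captures it. A sketch of the equivalent code:

```python
# Python 3: a generator's return value travels via StopIteration
# and is captured as the value of the yield from expression.
def delegated():
    yield 1
    return 2  # MyReturn(2) in the Python 2 version above

def composed():
    output = yield from delegated()
    yield output * 4

print(list(composed()))  # [1, 8]
```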

[Nov 16, 2017] Python generators and coroutines - Stack Overflow

Notable quotes:
"... Edit: I recommend using Greenlet . But if you're interested in a pure Python approach, read on. ..."
"... at a language level ..."
"... To anyone reading this in 2015 or later, the new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3 ..."
Nov 16, 2017 | stackoverflow.com

Python generators and coroutines

Giuseppe Maggiore ,May 10, 2011 at 10:25

I am studying coroutines and generators in various programming languages.

I was wondering if there is a cleaner way to combine together two coroutines implemented via generators than yielding back at the caller whatever the callee yields?

Let's say that we are using the following convention: all yields apart from the last one return null, while the last one returns the result of the coroutine. So, for example, we could have a coroutine that invokes another:

def A():
  # yield until a certain condition is met
  yield result

def B():
  # do something that may or may not yield
  x = bind(A())
  # ...
  return result

In this case, I wish that through bind (which may or may not be implementable; that's the question) the coroutine B yields whenever A yields, until A returns its final result, which is then assigned to x, allowing B to continue.

I suspect that the actual code should explicitly iterate A so:

def B():
  # do something that may or may not yield
  for x in A(): ()
  # ...
  return result

which is a tad ugly and error prone...

PS: it's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better!

S.Lott ,May 10, 2011 at 10:38

This is all covered on the Python web site. python.org/dev/peps/pep-0342 , python.org/dev/peps/pep-0334 and numerous blogs cover this. eecho.info/Echo/python/coroutines-python . Please Google, read, and then ask specific questions based on what you've read. – S.Lott May 10 '11 at 10:38

S.Lott ,May 10, 2011 at 13:04

I thought the examples clearly demonstrated idiomatic. Since I'm unable to understand what's wrong with the examples, could you state which examples you found to be unclear? Which examples were confusing? Can you be more specific on how all those examples were not able to show idiomatic Python? – S.Lott May 10 '11 at 13:04

Giuseppe Maggiore ,May 10, 2011 at 13:09

I've read precisely those articles, and the PEP-342 leaves me somewhat confused: is it some actual extension that is currently working in Python? Is the Trampoline class shown there part of the standard libraries of the language? BTW, my question was very precise, and it was about the IDIOMATIC way to pass control around coroutines. The fact that I can read about a ton of ways to do so really does not help. Neither does your snarkiness... – Giuseppe Maggiore May 10 '11 at 13:09

Giuseppe Maggiore ,May 10, 2011 at 13:11

Idiomatic is about the "standard" way to perform some function; there is absolutely nothing wrong with iterating the results of a nested coroutine, but there are examples in the literature of programming languages where yielding automatically climbs down the call stack and so you do not need to re-yield at each caller, hence my curiosity if this pattern is covered by syntactic sugar in Python or not! – Giuseppe Maggiore May 10 '11 at 13:11

S.Lott ,May 10, 2011 at 13:19

@Giuseppe Maggiore: "programming languages where yielding automatically climbs down the call stack" That doesn't sound like the same question. Are you asking for idiomatic Python -- as shown by numerous examples -- or are you asking for some other feature that's not shown in the Python examples but is shown in other languages? I'm afraid that I can't understand your question at all. Can you please clarify what you're really looking for? – S.Lott May 10 '11 at 13:19

blubb ,May 10, 2011 at 10:37

Are you looking for something like this?
def B():
   for x in A():
     if x is None:
       yield
     else:
       break

   # continue, x contains value A yielded
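To see how this pattern behaves under the asker's convention (every yield returns None except the last, which carries the result), here is a small self-contained sketch; the coroutine A below is hypothetical:

```python
def A():
    yield          # not finished yet
    yield 42       # final yield carries A's result

def B():
    x = None
    for x in A():
        if x is None:
            yield  # propagate the wait up to B's caller
        else:
            break
    # x now holds A's final value (42)
    yield x + 1

print(list(B()))  # [None, 43]
```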

Giuseppe Maggiore ,May 10, 2011 at 12:59

Yes, that is what I am doing. My question is if this is the idiomatic way or if there is some syntactic construct that is capable of hiding this pattern which recurs very often in my application. – Giuseppe Maggiore May 10 '11 at 12:59

blubb ,May 10, 2011 at 13:31

@Guiseppe Maggiore: I'm not aware of any such constructs. However, it seems strange that you need this pattern often... I can't think of many valid use cases off the top of my head. If you give more context information, maybe we can propose an alternative solution which is more elegant overall? – blubb May 10 '11 at 13:31

Giuseppe Maggiore ,May 10, 2011 at 15:17

It's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better! – Giuseppe Maggiore May 10 '11 at 15:17

blubb ,May 10, 2011 at 15:57

@Guiseppe Maggiore: I'd propose you add that last comment to the question so that other get a chance of answering it, too... – blubb May 10 '11 at 15:57

Simon Radford ,Nov 11, 2011 at 0:24

Edit: I recommend using Greenlet. But if you're interested in a pure Python approach, read on.

This is addressed in PEP 342, but it's somewhat tough to understand at first. I'll try to explain simply how it works.

First, let me sum up what I think is the problem you're really trying to solve.

Problem

You have a callstack of generator functions calling other generator functions. What you really want is to be able to yield from the generator at the top, and have the yield propagate all the way down the stack.

The problem is that Python does not (at a language level) support real coroutines, only generators. (But, they can be implemented.) Real coroutines allow you to halt an entire stack of function calls and switch to a different stack. Generators only allow you to halt a single function. If a generator f() wants to yield, the yield statement has to be in f(), not in another function that f() calls.

The solution that I think you're using now, is to do something like in Simon Stelling's answer (i.e. have f() call g() by yielding all of g()'s results). This is very verbose and ugly, and you're looking for syntax sugar to wrap up that pattern. Note that this essentially unwinds the stack every time you yield, and then winds it back up again afterwards.

Solution

There is a better way to solve this problem. You basically implement coroutines by running your generators on top of a "trampoline" system.

To make this work, you need to follow a couple of patterns:

1. When you want to call another coroutine, yield it.
2. Instead of returning a value, yield it.

so

def f():
    result = g()
    #  
    return return_value

becomes

def f():
    result = yield g()
    #  
    yield return_value

Say you're in f(). The trampoline system called f(). When you yield a generator (say g()), the trampoline system calls g() on your behalf. Then when g() has finished yielding values, the trampoline system restarts f(). This means that you're not actually using the Python stack; the trampoline system manages a callstack instead.

When you yield something other than a generator, the trampoline system treats it as a return value. It passes that value back to the caller generator through the yield statement (using the .send() method of generators).
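A minimal trampoline along these lines can be sketched as follows; the names and the demo coroutines are my own, not from the answer. A generator "calls" a sub-coroutine by yielding it, and "returns" by yielding a non-generator value, which the trampoline sends back into the caller:

```python
import types

def trampoline(root):
    """Drive a stack of generator-based coroutines (sketch)."""
    stack = [root]
    value = None
    while stack:
        gen = stack[-1]
        try:
            result = gen.send(value)  # send(None) also starts a fresh generator
        except StopIteration:
            stack.pop()               # generator ended without a "return" yield
            value = None
            continue
        value = None
        if isinstance(result, types.GeneratorType):
            stack.append(result)      # "call": descend into the sub-coroutine
        else:
            stack.pop()               # "return": hand the value to the caller
            value = result
    return value

def add_one(x):                       # hypothetical sub-coroutine
    yield x + 1                       # final yield acts as a return value

def f():                              # hypothetical top-level coroutine
    a = yield add_one(1)              # a == 2
    b = yield add_one(a)              # b == 3
    yield a + b                       # "returns" 5
```

Under these conventions, trampoline(f()) evaluates to 5. Note this sketch only handles the call/return pattern; a bare yield for cooperative waiting would need an extra case.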

Comments

This kind of system is extremely important and useful in asynchronous applications, like those using Tornado or Twisted. You can halt an entire callstack when it's blocked, go do something else, and then come back and continue execution of the first callstack where it left off.

The drawback of the above solution is that it requires you to write essentially all your functions as generators. It may be better to use an implementation of true coroutines for Python - see below.

Alternatives

There are several implementations of coroutines for Python, see: http://en.wikipedia.org/wiki/Coroutine#Implementations_for_Python

Greenlet is an excellent choice. It is a Python module that modifies the CPython interpreter to allow true coroutines by swapping out the callstack.

Python 3.3 should provide syntax for delegating to a subgenerator, see PEP 380 .

gaborous ,Nov 9, 2012 at 10:04

Very useful and clear answer, thank's! However, when you say that standard Python coroutines essentially require to write all functions as generators, did you mean only first level functions or really all functions? As you said above, when yielding something other than a generator, the trampoline system still works, so theoretically we can just yield at the first-layer functions any other functions that may or may not be generators themselves. Am I right? – gaborous Nov 9 '12 at 10:04

Simon Radford ,Nov 21, 2012 at 21:37

All "functions" between the trampoline system and a yield must be written as generators. You can call regular functions normally, but then you can't effectively "yield" from that function or any functions it calls. Does that make sense / answer your question? – Simon Radford Nov 21 '12 at 21:37

Simon Radford ,Nov 21, 2012 at 21:39

I highly recommend using Greenlet - it's a true implementation of coroutines for Python, and you don't have to use any of these patterns I've described. The trampoline stuff is for people who are interested in how you can do it in pure Python. – Simon Radford Nov 21 '12 at 21:39

Nick Sweeting ,Jun 7, 2015 at 22:12

To anyone reading this in 2015 or later, the new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3 . – Nick Sweeting Jun 7 '15 at 22:12

[Nov 14, 2017] Masterminds of Programming Conversations with the Creators of Major Programming Languages

Notable quotes:
"... What differences are there between developing a programming language and developing a "common" software project? ..."
"... How do you debug a language? ..."
"... How do you decide when a feature should go in a library as an extension or when it needs to have support from the core language? ..."
"... I suppose there are probably features that you've looked at that you couldn't implement in Python other than by changing the language, but you probably rejected them. What criteria do you use to say this is something that's Pythonic, this is something that's not Pythonic? ..."
"... You have the "Zen of Python," but beyond that? ..."
"... Sounds almost like it's a matter of taste as much as anything ..."
"... There's an argument to make for parsimony there, but very much in the context of personal taste ..."
"... How did the Python Enhancement Proposal (PEP) process come about? ..."
"... Do you find that adding a little bit of formalism really helps crystallize the design decisions around Python enhancements? ..."
"... Do they lead to a consensus where someone can ask you to weigh in on a single particular crystallized set of expectations and proposals? ..."
"... What creates the need for a new major version? ..."
"... How did you choose to handle numbers as arbitrary precision integers (with all the cool advantages you get) instead of the old (and super common) approach to pass it to the hardware? ..."
"... Why do you call it a radical step? ..."
"... How did you adopt the "there should be one -- and preferably only one -- obvious way to do it" philosophy? ..."
"... What is your take on static versus dynamic typing? ..."
"... Are we moving toward hybrid typing? ..."
"... Why did you choose to support multiple paradigms? ..."
"... When you created the language, did you consider the type of programmers it might have attracted? ..."
"... How do you balance the different needs of a language that should be easy to learn for novices versus a language that should be powerful enough for experienced programmers to do useful things? Is that a false dichotomy? ..."
Nov 14, 2017 | www.amazon.com

The Pythonic Way

What differences are there between developing a programming language and developing a "common" software project?

Guido van Rossum : More than with most software projects, your most important users are programmers themselves. This gives a language project a high level of "meta" content. In the dependency tree of software projects, programming

How do you debug a language?

Guido : You don't. Language design is one area where agile development methodologies just don't make sense -- until the language is stable, few people want to use it, and you won't find the bugs in the language definition until you have so many users that it's too late to change things.

Of course there's plenty in the implementation that can be debugged like any old program, but the language design itself pretty much requires careful design up front, because the cost of bugs is so exorbitant.

How do you decide when a feature should go in a library as an extension or when it needs to have support from the core language?

Guido : Historically, I've had a pretty good answer for that. One thing I noticed very early on was that everybody wants their favorite feature added to the language, and most people are relatively inexperienced about language design. Everybody is always proposing "let's add this to the language," "let's have a statement that does X." In many cases, the answer is, "Well, you can already do X or something almost like X by writing these two or three lines of code, and it's not all that difficult." You can use a dictionary, or you can combine a list and a tuple and a regular expression, or write a little metaclass -- all of those things. I may even have had the original version of this answer from Linus, who seems to have a similar philosophy.

Telling people you can already do that and here is how is a first line of defense. The second thing is, "Well, that's a useful thing and we can probably write or you can probably write your own module or class, and encapsulate that particular bit of abstraction." Then the next line of defense is, "OK, this looks so interesting and useful that we'll actually accept it as a new addition to the standard library, and it's going to be pure Python." And then, finally, there are things that just aren't easy to do in pure Python and we'll suggest or recommend how to turn them into a C extension. The C extensions are the last line of defense before we have to admit, "Well, yeah, this is so useful and you really cannot do this, so we'll have to change the language."

There are other criteria that determine whether it makes more sense to add something to the language or it makes more sense to add something to the library, because if it has to do with the semantics of namespaces or that kind of stuff, there's really nothing you can do besides changing the language. On the other hand, the extension mechanism was made powerful enough that there is an amazing amount of stuff you can do from C code that extends the library and possibly even adds new built-in functionality without actually changing the language. The parser doesn't change. The parse tree doesn't change. The documentation for the language doesn't change. All your tools still work, and yet you have added new functionality to your system.

I suppose there are probably features that you've looked at that you couldn't implement in Python other than by changing the language, but you probably rejected them. What criteria do you use to say this is something that's Pythonic, this is something that's not Pythonic?

Guido : That's much harder. That is probably, in many cases, more a matter of a gut feeling than anything else. People use the word Pythonic and "that is Pythonic" a lot, but nobody can give you a watertight definition of what it means for something to be Pythonic or un-Pythonic.

You have the "Zen of Python," but beyond that?

Guido : That requires a lot of interpretation, like every good holy book. When I see a good or a bad proposal, I can tell if it is a good or bad proposal, but it's really hard to write a set of rules that will help someone else to distinguish good language change proposals from bad change proposals.

Sounds almost like it's a matter of taste as much as anything

Guido : Well, the first thing is always try to say "no," and see if they go away or find a way to get their itch scratched without changing the language. It's remarkable how often that works. That's more of an operational definition of "it's not necessary to change the language."

If you keep the language constant, people will still find a way to do what they need to do. Beyond that it's often a matter of use cases coming from different areas where there is nothing application-specific. If something was really cool for the Web, that would not make it a good feature to add to the language. If something was really good for writing shorter functions or writing classes that are more maintainable, that might be a good thing to add to the language. It really needs to transcend application domains in general, and make things simpler or more elegant.

When you change the language, you affect everyone. There's no feature that you can hide so well that most people don't need to know about. Sooner or later, people will encounter code written by someone else that uses it, or they'll encounter some obscure corner case where they have to learn about it because things don't work the way they expected.

Often elegance is also in the eye of the beholder. We had a recent discussion on one of the Python lists where people were arguing forcefully that using dollar instead of self-dot was much more elegant. I think their definition of elegance was number of keystrokes.

There's an argument to make for parsimony there, but very much in the context of personal taste

Guido : Elegance and simplicity and generality all are things that, to a large extent, depend on personal taste, because what seems to cover a larger area for me may not cover enough for someone else, and vice versa.

How did the Python Enhancement Proposal (PEP) process come about?

Guido : That's a very interesting historical tidbit. I think it was mostly started and championed by Barry Warsaw, one of the core developers. He and I started working together in '95, and I think around 2000, he came up with the suggestion that we needed more of a formal process around language changes.

I tend to be slow in these things. I mean I wasn't the person who discovered that we really needed a mailing list. I wasn't the person who discovered that the mailing list got unwieldy and we needed a newsgroup. I wasn't the person to propose that we needed a website. I was also not the person to propose that we needed a process for discussing and inventing language changes, and making sure to avoid the occasional mistake where things had been proposed and quickly accepted without thinking through all of the consequences.

At the time between 1995 and 2000, Barry, myself, and a few other core developers, Fred Drake, Ken Manheimer for a while, were all at CNRI, and one of the things that CNRI did was organize the IETF meetings. CNRI had this little branch that eventually split off that was a conference organizing bureau, and their only customer was the IETF. They later also did the Python conferences for a while, actually. Because of that it was a pretty easy boondoggle to attend IETF meetings even if they weren't local. I certainly got a taste of the IETF process with its RFCs and its meeting groups and stages, and Barry also got a taste of that. When he proposed to do something similar for Python, that was an easy argument to make. We consciously decided that we wouldn't make it quite as heavy-handed as the IETF RFCs had become by then, because Internet standards, at least some of them, affect way more industries and people and software than a Python change, but we definitely modeled it after that. Barry is a genius at coming up with good names, so I am pretty sure that PEP was his idea.

We were one of the first open source projects at the time to have something like this, and it's been relatively widely copied. The Tcl/Tk community basically changed the title and used exactly the same defining document and process, and other projects have done similar things.

Do you find that adding a little bit of formalism really helps crystallize the design decisions around Python enhancements?

Guido : I think it became necessary as the community grew and I wasn't necessarily able to judge every proposal on its value by itself. It has really been helpful for me to let other people argue over various details, and then come with relatively clear-cut conclusions.

Do they lead to a consensus where someone can ask you to weigh in on a single particular crystallized set of expectations and proposals?

Guido : Yes. It often works in a way where I initially give a PEP a thumb's up in the sense that I say, "It looks like we have a problem here. Let's see if someone figures out what the right solution is." Often they come out with a bunch of clear conclusions on how the problem should be solved and also a bunch of open issues. Sometimes my gut feelings can help close the open issues. I'm very active in the PEP process when it's an area that I'm excited about -- if we had to add a

What creates the need for a new major version?

Guido : It depends on your definition of major. In Python, we generally consider releases like 2.4, 2.5, and 2.6 "major" events, which only happen every 18–24 months. These are the only occasions where we can introduce new features. Long ago, releases were done at the whim of the developers (me, in particular). Early this decade, however, the users requested some predictability -- they objected against features being added or changed in "minor" revisions (e.g., 1.5.2 added major features compared to 1.5.1), and they wished the major releases to be supported for a certain minimum amount of time (18 months). So now we have more or less time-based major releases: we plan the series of dates leading up to a major release (e.g., when alpha and beta versions and release candidates are issued) long in advance, based on things like release manager availability, and we urge the developers to get their changes in well in advance of the final release date.

Features selected for addition to releases are generally agreed upon by the core developers, after (sometimes long) discussions on the merits of the feature and its precise specification. This is the PEP process: Python Enhancement Proposal, a document-based process not unlike the IETF's RFC process or the Java world's JSR process, except that we aren't quite as formal, as we have a much smaller community of developers. In case of prolonged disagreement (either on the merits of a feature or on specific details), I may end up breaking a tie; my tie-breaking algorithm is mostly intuitive, since by the time it is invoked, rational argument has long gone out of the window.

The most contentious discussions are typically about user-visible language features; library additions are usually easy (as they don't harm users who don't care), and internal improvements are not really considered features, although they are constrained by pretty stringent backward compatibility at the C API level.

Since the developers are typically the most vocal users, I can't really tell whether

There's also the concept of a radically major or breakthrough version, like 3.0. Historically, 1.0 was evolutionarily close to 0.9, and 2.0 was also a relatively small step from 1.6. From now on, with the much larger user base, such versions are rare indeed, and provide the only occasion for being truly incompatible with previous versions. Major versions are made backward compatible with previous major versions with a specific mechanism available for deprecating features slated for removal.

How did you choose to handle numbers as arbitrary precision integers (with all the cool advantages you get) instead of the old (and super common) approach to pass it to the hardware?

Guido : I originally inherited this idea from Python's predecessor, ABC. ABC used arbitrary precision rationals, but I didn't like the rationals that much, so I switched to integers; for reals, Python uses the standard floating-point representation supported by the hardware (and so did ABC, with some prodding).

Originally Python had two types of integers: the customary 32-bit variety ("int") and a separate arbitrary precision variety ("long"). Many languages do this, but the arbitrary precision variety is relegated to a library, like Bignum in Java and Perl, or GNU MP for C.

Nowadays, an int operation whose result no longer fits is automatically promoted to the arbitrary precision variety. Previously, this would raise an OverflowError exception. There was once a time where the result would silently be truncated, but I changed it to raising an exception before ever letting others use the language. In early 1990, I wasted an afternoon debugging a short demo program I'd written implementing an algorithm that made non-obvious use of very large integers. Such debugging sessions are seminal experiences.

However, there were still certain cases where the two number types behaved slightly differently; for example, printing an int in hexadecimal or octal format would produce an unsigned outcome (e.g., –1 would be printed as FFFFFFFF), while doing the same on the mathematically equal long would produce a signed outcome (–1, in this case). In Python 3.0, we're taking the radical step of supporting only a single integer type; we're calling it int, but the implementation is largely that of the old long type.
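The unified int in Python 3 behaves like the old long everywhere, including formatting; a quick sketch of both points from the quote above:

```python
# Python 3 has a single arbitrary-precision int type.
print(2 ** 100)   # 1267650600228229401496703205376, no overflow
print(hex(-1))    # -0x1, a signed outcome like the old long, not FFFFFFFF
```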

Why do you call it a radical step?

Guido : Mostly because it's a big deviation from current practice in Python. There was a lot of discussion about this, and people proposed various alternatives where two (or more) representations would be used internally, but completely or mostly hidden from end users (but not from C extension writers). That might perform a bit better, but in the end it was already a massive amount of work, and having two representations internally would just increase the effort of getting it right, and make interfacing to it from C

How did you adopt the "there should be one -- and preferably only one -- obvious way to do it" philosophy?

Guido : This was probably subconscious at first. When Tim definition of having one way (or one true way) to express something. For example, the XYZ coordinates of any point in 3D space are uniquely determined, once you've picked an origin and three basis vectors.

I also like to think that I'm doing most users a favor by not requiring them to choose between similar alternatives. You can contrast this with Java, where if you need a listlike data structure, the standard library offers many versions (a linked list, or an array list, and others), or C, where you have to decide how to implement your own list data type.

What is your take on static versus dynamic typing?

Guido : I wish I could say something simple like "

In some situations the verbosity of Java is considered a plus; it has enabled the creation of powerful code-browsing tools that can answer questions like "where is this variable changed?" or "who calls this method?" Dynamic languages make answering such questions harder, because it's often hard to find out the type of a method argument without analyzing every path through the entire codebase. I'm not sure how functional languages like Haskell support such tools; it could well be that you'd have to use essentially the same technique as for dynamic languages, since that's what type inferencing does anyway -- in my limited understanding!

Are we moving toward hybrid typing?

Guido : I expect there's a lot to say for some kind of hybrid. I've noticed that most large systems written in a statically typed language actually contain a significant subset that is essentially dynamically typed. For example, GUI widget sets and database APIs for Java often feel like they are fighting the static typing every step of the way, moving most correctness checks to runtime.

A hybrid language with functional and dynamic aspects might be quite interesting. I should add that despite Python's support for some functional tools like map() and lambda, Python does not have a functional-language subset: there is no type inferencing, and no opportunity for parallelization.

Why did you choose to support multiple paradigms?

Guido : I didn't really; Python supports procedural programming, to some extent, and OO. These two aren't so different, and Python's procedural style is still strongly influenced by objects (since the fundamental data types are all objects). Python supports a tiny bit of functional programming -- but it doesn't resemble any real functional language, and it never will. Functional languages are all about doing as much as possible at compile time -- the "functional" aspect means that the compiler can optimize things under a very strong guarantee that there are no side effects, unless explicitly declared. Python is about having the simplest, dumbest compiler imaginable, and the official runtime semantics actively discourage cleverness in the compiler like parallelizing loops or turning recursion into loops.

Python probably has the reputation of supporting functional programming based on the inclusion of lambda, map, filter, and reduce in the language, but in my eyes these are just syntactic sugar, and not the fundamental building blocks that they are in functional languages. The more fundamental property that Python shares with Lisp (not a functional language either!) is that functions are first-class objects, and can be passed around like any other object. This, combined with nested scopes and a generally Lisp-like approach to function state, makes it possible to easily implement concepts that superficially resemble concepts from functional languages, like currying, map, and reduce. In a functional language, those concepts themselves are the primitive operations; in Python, the operations needed to implement them are ordinary language features. You can write reduce() in a few lines of Python. Not so in a functional language.
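As a quick illustration of that last claim, reduce() really can be written in a few lines of Python. This is a sketch, not the stdlib implementation:

```python
def my_reduce(func, iterable, initial=None):
    """Fold an iterable with a two-argument function (sketch of reduce)."""
    it = iter(iterable)
    acc = next(it) if initial is None else initial
    for item in it:
        acc = func(acc, item)
    return acc

print(my_reduce(lambda a, b: a + b, [1, 2, 3, 4]))      # 10
print(my_reduce(lambda a, b: a * b, [1, 2, 3], 10))     # 60
```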

When you created the language, did you consider the type of programmers it might have attracted?

Guido : Yes, but I probably didn't have enough imagination. I was thinking of professional programmers in a Unix or Unix-like environment. Early versions of the Python tutorial used a slogan something like "Python bridges the gap between C and shell programming," because that was where I was myself, and the people immediately around me. It never occurred to me that Python would be a

The fact that it was useful for teaching first principles of

How do you balance the different needs of a language that should be easy to learn for novices versus a language that should be powerful enough for experienced programmers to do useful things? Is that a false dichotomy?

Guido : Balance is the word. There are some well-known traps to avoid, like stuff that is thought to help novices but annoys

[Nov 09, 2017] Conversion of Perl to Python

Nov 09, 2017 | stackoverflow.com

I think you should rewrite your code. The quality of the results of a parsing effort depends on your Perl coding style. I think the quote below sums up the theoretical side very well. From the Wikipedia article on Perl:

Perl has a Turing-complete grammar because parsing can be affected by run-time code executed during the compile phase.[25] Therefore, Perl cannot be parsed by a straight Lex/Yacc lexer/parser combination. Instead, the interpreter implements its own lexer, which coordinates with a modified GNU bison parser to resolve ambiguities in the language.

It is often said that "Only perl can parse Perl," meaning that only the Perl interpreter (perl) can parse the Perl language (Perl), but even this is not, in general, true. Because the Perl interpreter can simulate a Turing machine during its compile phase, it would need to decide the Halting Problem in order to complete parsing in every case. It's a long-standing result that the Halting Problem is undecidable, and therefore not even Perl can always parse Perl. Perl makes the unusual choice of giving the user access to its full programming power in its own compile phase. The cost in terms of theoretical purity is high, but practical inconvenience seems to be rare.

Other programs that undertake to parse Perl, such as source-code analyzers and auto-indenters, have to contend not only with ambiguous syntactic constructs but also with the undecidability of Perl parsing in the general case. Adam Kennedy's PPI project focused on parsing Perl code as a document (retaining its integrity as a document), instead of parsing Perl as executable code (which not even Perl itself can always do). It was Kennedy who first conjectured that, "parsing Perl suffers from the 'Halting Problem'."[26], and this was later proved.[27]

Starting in 5.10, you can compile perl with the experimental Misc Attribute Decoration enabled and set the PERL_XMLDUMP environment variable to a filename to get an XML dump of the parse tree (including comments - very helpful for language translators). Though as the doc says, this is a work in progress.

Looking at the PLEAC stuff, what we have here is a case of a rote translation of a technique from one language causing another to look bad. For example, it's rare in Perl to work character-by-character. Why? For one, it's a pain in the ass. A fair cop. For another, you can usually do it faster and easier with a regex. One can reverse the OP's statement and say "in Perl, regexes are so easy that most of the time other string manipulation is not needed". Anyhow, the OP's sentiment is correct. You do things differently in Perl than in Python, so a rote translator would produce nasty code. – Schwern Apr 8 '10 at 11:47
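Schwern's point about regexes versus character-by-character code applies just as well on the Python side. A hypothetical example (the task and function names are mine): collapsing runs of whitespace with an explicit character loop versus a one-line re.sub:

```python
import re

def squeeze_chars(s):
    """Collapse whitespace runs to single spaces, one character at a time."""
    out, prev_space = [], False
    for ch in s:
        if ch.isspace():
            if not prev_space:
                out.append(" ")
            prev_space = True
        else:
            out.append(ch)
            prev_space = False
    return "".join(out)

def squeeze_re(s):
    """The same job as a single regex substitution."""
    return re.sub(r"\s+", " ", s)
```

Both return the same result; the regex version is the idiomatic one in either language.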

Converting would require writing a Perl parser, semantic checker, and Python code generator.

Not practical. Perl parsers are hard enough for the Perl teams to get right. You'd be better off translating Perl to Python from the Perl AST (opcodes) using the Perl Opcode or related modules.

http://perldoc.perl.org/Opcode.html

Some notations do not map from Perl to Python without some work. Perl's closures are different, for example. So is its regex support.
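One concrete closure difference a hand-translator has to watch for: a Perl closure can assign to a captured lexical directly, while in Python 3 rebinding an enclosing variable requires an explicit nonlocal declaration. A minimal sketch of the Python side (the counter example is mine):

```python
def make_counter():
    count = 0
    def tick():
        # Without `nonlocal`, the assignment below would make `count`
        # local to tick() and raise UnboundLocalError.
        nonlocal count
        count += 1
        return count
    return tick

counter = make_counter()
```

A mechanical Perl-to-Python translator would have to insert such declarations (or rewrite the state into a mutable object) wherever a closure mutates its environment.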

In short, either convert it by hand, or use some integration modules to call Python from Perl or vice versa.

[Nov 07, 2017] Is PyCharm good - Quora

Nov 07, 2017 | www.quora.com

Cody Jackson , Python book author ( https://python-ebook.blogspot.com ) Answered Sep 11

I stumbled upon PyCharm a few years ago when my editor of choice (Stani's Python Editor) was no longer maintained. I haven't looked back.

I used the community edition for many years then decided to purchase a copy. While I don't necessarily need all the functionality of the paid version, I want to support the company in their work.

The PEP 8 notifications are nice to have. While PEP 8 is more of a guideline, it certainly helps ensure code looks nice and is easy to work with.

What's better, IMO, is the ability to load anything you want without having to explicitly download it. Import a module that isn't already on your system? PyCharm will let you know and offer to download it for you. Very handy.

I used to use GitKraken for GitHub work but the built-in VCS tools in PyCharm are just as easy to use, so I haven't bothered to download GitKraken for several months now. PyCharm highlights your modified files using color codes, so you know what you have updated, what's new, etc. so you know exactly what is going to be added in your next push. It also shows you what has changed between the different files using diff, which is handy.

PyCharm has built-in support for many different frameworks, the paid version obviously having more support. However, the free version includes Django, HTML, CSS, and JavaScript, which is sufficient for most people.

While the paid version has changed from a perpetual license to a subscription model, the monthly cost is only $8 per month for an individual, with certain discounts available.

Overall, PyCharm is the best proprietary Python editor and, unless you prefer completely FOSS software, there is no reason not to use it.

Yosef Dishinger , I dream in Python Answered Sep 28

The other answers have already said most of it, but I would just add that the search and code discovery features of PyCharm are superior to anything else I've used.

I work on a pretty large codebase, and with PyCharm you can search throughout the entire project, or even multiple projects, for a given string. Now it's true that other editors also have this feature, but PyCharm adds something here that other editors don't.

It lets you edit the code where the reference was found, in a panel within the search results window, and simply go through each search result one by one and see and modify the code in each section as you go, without needing to open the different files on their own.

At times when I've needed to do major refactoring this has been a lifesaver. It increased my productivity dramatically.

There are a lot of really nice editors out there, but I haven't come across anything like PyCharm for taming large codebases.

Edward Moseley , Python for programming, R for stats, C/C++ for microcontrollers Answered Aug 27 2016

I'm very much in agreement with User-9321510923064044481

If you begin to use a library that you don't have installed, PyCharm will let you know and makes the installation process really seamless. RStudio could actually probably take a page out of PyCharm's playbook, there.

I use the integrated python console very frequently for prototyping.

There's also this "Tip of the day" popup that I always mean to shut off but, well, sometimes they are good tips.

This may be nit-picky, but I especially agree that I don't use the integrated VCS , and until they find a more elegant way to integrate it I will stick to git on my command line.

[Nov 07, 2017] How to use Python interactively in PyCharm

Nov 07, 2017 | www.quora.com

Tony Flury , Freelance s/w developer Answered Apr 2

PyCharm, when it starts, will also start a Python terminal as part of the project window. Look along the bottom, where you will have tabs such as console and terminal.

PyCharm also offers integration with Jupyter notebooks, but I haven't tried to use that feature yet.

Zdenko Hrcek , enjoying programming in Python Answered Apr 2

In main menu under Tools there is "Python console" option


[Nov 07, 2017] Should I use PyCharm for programming Python

Nov 07, 2017 | www.quora.com

AP Rajshekhar , Knows Java, Python, Ruby, Go; dabbled in Qt and GTK# Answered Sep 24, 2016

As with any other language, one does not need an IDE, which PyCharm is. However, it has been my experience that having an IDE improves productivity. Same is true with PyCharm.

If you are developing small applications that do not need git integration or PEP 8 conformance, then you don't need PyCharm. However, if you need any of the above, and do not want to use multiple tools (flake8, git-cli/git-cola) manually, then PyCharm is a good choice as it provides the following, apart from autocomplete, from within the IDE:

So, PyCharm improves your productivity quite a bit.

Dominic Christoph , Met cofounders at a local meetup Updated Apr 5

It's obviously not necessary, and there are other free editors and IDEs. But in my experience, it is the best option.

I've used both Vim and Emacs and played with Sublime and Atom a bit. Those four editors allow you to highly customize your programming environment. Which some feel is a necessity.

They're all great, but you will miss out on some features that no one (that I know of; if you do, please share) has been able to properly recreate in a regular editor. Mainly, intelligent code navigation and completion. These are the most useful features that I've used, and PyCharm does them **almost** perfectly.

You'll spend much more time navigating code than you will typing code, so it's very helpful to be able to hit a keyboard shortcut and jump to a variable or method's definition/declaration. When you are typing, the intelligent autocomplete will be a big help as well. It's much more usable than the completion engines in editors because it only provides completions which are in scope. There are also Ctags and Gtags available for text editors, but they are harder to use, must be customized for every language, and work poorly with any medium to large sized project. Though YMMV.

When it comes down to it, I prefer having features that work really well than the ability to customize. Download the community edition and see for yourself if it works for you. Especially for a beginner, it will save you the time of learning tools, which isn't as important as learning the language, because the UI is self-explanatory.

P.S.

I would find it unusable without the IdeaVim plugin. The keybindings of Vim are just too good to give up.

I should also mention that Jetbrains IDEs are very customizable themselves. The IdeaVim plugin even has a dotfile.

You'll also find videos on YouTube where programmers try to discourage others from using them because of the distracting number of panes. Though it has a distraction-free mode and, even without that, if you use it sensibly, you can have it display only the editor and tabs.

Pandu Poluan , programmed in Python for nearly a year, to replace complex bash scripts. Answered Mar 24

You don't *have* to use PyCharm, but its features are so good *I* find it essential for Python development.

Things I can't live without:

There are many more PyCharm features, but all the above make PyCharm for me a must-have for Python development.

[Nov 07, 2017] Amazon.com Customer reviews Python Cookbook, Third edition

Notable quotes:
"... There are a couple of quick final points to make about the Python cookbook. Firstly, it uses Python 3, and as many very useful third-party modules haven't been ported from Python 2.X over to Python 3 yet, Python 2.X is probably still more widely used. ..."
"... although this is a language-learning book, it's not aimed at the novice programmer; think of it more as language nuances and inflections for the experienced Pythonista rather than a how-to-learn-Python book and you won't go far wrong. ..."
"... Most examples are self contained and all the code examples that I tried worked. Additionally, there is a GitHub that the authors created which provides all the code for the examples if you do not want type it yourself. The examples themselves were applied to real world problems; I could see how the recipe was used clearly. When the authors felt they could not provide an entire solution in the text, they point the correct place to visit online. ..."
"... But that's only the beginning. It's hard to describe the pleasure of reading some of the solutions in the Iterators and Generators section, for instance. Actually, I take that back. The pleasure is the same kind as what you may have felt when you first came upon ideas in books such as Bentley's Programming Pearls, way back when. ..."
"... The Active State repository of Python recipes includes many gems, but as the Authors observe in their preference: "most of these recipes are steeped in history and the past". ..."
Nov 07, 2017 | www.amazon.com

renaissance geek on June 23, 2013

The Python Domestic Science Textbook?

A few years ago now I was working in a job that required me to code in PERL. My PERL is passable but no better than that, so when I found a copy of the PERL cookbook it was something of a life saver and constant companion. The PERL cookbook is deeply pragmatic and addresses real world problems, with the language almost as an afterthought. (Which, now I think about it, is actually a pretty good description of PERL anyway!) The Python cookbook is a very different beast and is much more an exercise in learning the intricacies and nuances of the language. I'm not sure cookbook is the right title - if the PERL Cookbook is a cookbook then the Python Cookbook is more of a domestic science textbook. A bit deeper, a bit dryer and not so focused on immediate problems. This is in no way meant to imply that it's a bad book; on the contrary, it's a very good book, just not entirely what I was expecting.

The book itself is divided into fifteen large sections covering the likes of data structures and algorithms; functions; metaprogramming and concurrency with each section consisting of a number of problems. The problems are structured as a definition of the problem, a solution and a discussion of the solution and how it can be extended. Due to the nature of the Python language a large part of solving the problems lies in knowing which module(s) to include in your code so each problem is generally only a couple of pages, but that is certainly enough to give the solution and reasonably detailed discussion.

As with all books of this type there is going to be some complaints of why is X included and not Y and to be honest if you tried to cover all the possible problems a practicing python programmer is likely to run across the book would end up so large as to be unusable. That being said there was, for me at least, one glaring omission.

I do a lot of data processing with reasonably large data sets, and with the buzz around big data I'm sure I'm not the only one, and frequently find that I have to break down the data sets or I simply consume all the system resources and the program exits. I would have expected at least some treatment of working with very large data sets which seems to be entirely missing.

However this is an issue based on what I use Python for and may very well not matter to you. Even though there may not be exactly the solution you are looking for, there are 260 problems and solutions in the Python cookbook so if you don't learn something new you are probably a certified Python genius and beyond manuals anyway.

There are a couple of quick final points to make about the Python cookbook. Firstly, it uses Python 3, and as many very useful third-party modules haven't been ported from Python 2.X over to Python 3 yet, Python 2.X is probably still more widely used.

Secondly, although this is a language-learning book, it's not aimed at the novice programmer; think of it more as language nuances and inflections for the experienced Pythonista rather than a how-to-learn-Python book and you won't go far wrong.

Bluegeek on June 13, 2013
Review: "Python Cookbook" by David Beazley and Brian K. Jones; O'Reilly Media

The "Python Cookbook" is a book that brings the Python scripting language to O'Reilly's popular "Cookbook" format. Each Cookbook provides a series of "Recipes" that teach users common techniques, helping them become productive quickly and serving as a reference for those who might've forgotten how to do something.

I reviewed this book in the Mobi e-book format. Reading it on Kindle for PC, the Table of Contents only shows the major sections rather than the individual recipes and this made it harder to find what I was looking for. This is apparently a limitation of Kindle for PC, since my Kindle 3 and Kindle for Android had no such issue.

When I use an O'Reilly "Cookbook", I judge it according to its usefulness: Can I become productive quickly? Is it easy to find what I need? Does it provide helpful tips? Does it teach me where to find the answers to my questions?

This book is not targeted at new Python programmers, but that's where I'm at. The best way for me to learn a new scripting language is to dive right in and try to write something useful, and that was my goal for the "Python Cookbook". I also had "Learning Python" handy to cover any of the basics.

My first Python script was written to read in lists of subnets from two separate files and check that every subnet in list B was also in list A.

I used Recipe 13.3 to parse the command line options. Recipe 5.1 showed me how to read and write files. Recipe 2.11 taught me how to strip carriage returns out of my lines. Recipe 1.10, "Removing Duplicates from a Sequence while Maintaining Order", was very helpful and I was able to reuse the code in my own script. Recipe 2.14, "Combining and Concatenating Strings", helped me with my print statements. Considering this was the first Python script I ever wrote and that it ran, I consider both it and the "Python Cookbook" a success.
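The order-preserving de-duplication technique of Recipe 1.10 that the reviewer reused is, in essence, a set-tracking generator. A sketch of the idea (the book's recipe also supports a key function for unhashable items, omitted here):

```python
def dedupe(items):
    """Yield items in their original order, skipping values seen before."""
    seen = set()
    for item in items:
        if item not in seen:
            seen.add(item)
            yield item
```

For example, `list(dedupe([1, 5, 2, 1, 9, 1, 5, 10]))` gives `[1, 5, 2, 9, 10]` — handy for the reviewer's task of comparing subnet lists without reordering them.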

I had a bit more trouble with my second script. I was trying to write a script to find the subnet address given an interface address in CIDR notation. Recipe 11.4 introduced the ipaddress module, but this module refused to accept a string variable containing the interface in CIDR notation. I ended up installing another module (netaddr) I found via Google and things went better after that. I suspect the problem was that I was using ActivePython 2.7.2.5 [64 bit] and this book was written for Python 3.
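For the record, on Python 3.3+ the stdlib ipaddress module does accept an interface address in CIDR notation directly, which supports the reviewer's suspicion that running on Python 2 was the culprit (the Python 2 ipaddress backport insists on unicode strings). The addresses below are illustrative:

```python
import ipaddress

# An interface address in CIDR notation parses directly in Python 3.
iface = ipaddress.ip_interface("192.168.1.10/24")
subnet = iface.network  # the containing subnet as an IPv4Network
```

Here `subnet` is the network `192.168.1.0/24`, which is exactly the subnet-from-interface computation the second script needed.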

As a DNS professional I was disappointed that there were no DNS-related recipes in the Network and Web Programming section, but Web-related topics were well-represented in the book.

The "Python Cookbook" doesn't seem to have quite the depth and organization of the "Perl Cookbook" but I'm sure I will rely on it heavily as I learn to use Python. It did allow me to be productive very quickly and it passes the "Cookbook" standard with flying colors. Any book that can get me to the point of writing a working, useful script in less than a day is worth using. I recommend this book to anyone who has a basic understanding of Python and wants to get past "Hello, World" and "Eat Spam" as fast as possible.

Reviewer's Note: I received a free copy of the "Python Cookbook" which was used to write this review.

William P Ross Enthusiast: Architecture on May 6, 2016
Treasure Trove of Python Recipes

Python Cookbook goes in depth on a variety of different Python topics. Each section is similar to a question that might be asked on Stack Overflow. The recipes range in difficulty from easy to advanced metaprogramming.

One particular recipe that I liked was 9.1 on how to time a function. When I am using Python I often need to time the code, and usually I need to look up how to do it. This example created a decorator function for timing. It makes it so that you can just put @timethis on top of a function and see how long it takes to execute. I appreciated how elegant this solution was as opposed to the way I was implementing it.
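A decorator in the spirit of the @timethis the reviewer describes might look like this (a sketch under my own naming; the book's exact recipe may differ in detail):

```python
import time
from functools import wraps

def timethis(func):
    """Report how long each call to the decorated function takes."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - start
            print("{}: {:.6f}s".format(func.__name__, elapsed))
    return wrapper

@timethis
def countdown(n):
    while n > 0:
        n -= 1
```

Putting `@timethis` above a function definition is all it takes; every call then prints its elapsed time, as the review describes.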

Most examples are self contained and all the code examples that I tried worked. Additionally, there is a GitHub that the authors created which provides all the code for the examples if you do not want type it yourself. The examples themselves were applied to real world problems; I could see how the recipe was used clearly. When the authors felt they could not provide an entire solution in the text, they point the correct place to visit online.

The range in topics was impressive. I found the most challenging chapters to be 9, 12, and 15 which were on metaprogramming, concurrency, and C Extensions. At the beginning of the book the recipes cover topics you would expect like data structures and algorithms, strings, and generators. I found myself surprised that I had not seen a lot of the techniques and solutions before. They were well crafted solutions, and I appreciated how much time and detail the authors must have spent to gather the information.

This is a great reference to have by your side when programming in Python.

Groundhog Day on June 30, 2015
Programming Pearls... Reloaded

Having read some humdrum works in the Cookbook series, my expectations were not very high. However, I soon discovered that this book is in a different league.

When he discusses a problem, Beazley gives you his favorite solution. He also presents alternatives, discusses pros and cons, and calls your attention to subtle details in the solution --- leaving you with a feeling of having learned something of value.

But that's only the beginning. It's hard to describe the pleasure of reading some of the solutions in the Iterators and Generators section, for instance. Actually, I take that back. The pleasure is the same kind as what you may have felt when you first came upon ideas in books such as Bentley's Programming Pearls, way back when.

I hadn't felt that excited about a programming book in a long time. This is one you can take along with you on a weekend just for the pleasure of sipping from it. Sad to say, but there are many O'Reilly books I feel like passing on soon after acquiring them. This one will have a special place on the shelves.

Devendra on September 1, 2013
Extensive tome of recipes for the Python 3 programmer

Python Cookbook is an extensive tome of recipes for the Python 3 programmer. It is a perfect companion book for those migrating Python 2 code to Python 3. If you are stuck with Python 2, you may still find the second edition of the book for sale, but the recipes may be dated as they cover Python 2.4. It is not a beginners book. If you are looking for a beginners book, I recommend Learning Python by Mark Lutz.

A quick chapter summary follows.

I've added this book to my list of references to look into, before heading to Google. Source code listings use syntax highlighting, a nice touch that makes the code easier, and less boring, to read.

I thank O'Reilly media for providing the book for review.

Dan on July 23, 2013
Wisdom - not just examples. Best viewed on a larger screen

The Active State repository of Python recipes includes many gems, but as the Authors observe in their preference: "most of these recipes are steeped in history and the past".

I'd add that the signal to noise ratio seems to be decreasing. The most prolific contributors (with the exception of Raymond Hettinger) have posted trivial examples rather than recipes. This book includes some simple examples too, but it's always in the context of a larger message. Excellent content and advice without the chaff.

I just bought this today. Unlike some early technical Kindle books I've purchased, the formatting is excellent. Kudos to the authors and publisher.

... ... ...

A. Zubarev on September 17, 2013
A book to read and come back again and again

I am tempted to state right away that this book is one of these rare "gems"! Absolutely worth every penny spent and perhaps even more in a way of getting more done in less time or even just can be used to advance professionally. So big thank you to Alex Martelli and David Ascher! I can't imagine how much time, energy, insight and effort the authors put into this book, but it is sure one of the longest professional books I have ever read.

Like I said, this book is very comprehensive at 608 pages long and touches most, if not all, aspects a typical IT pro would deal with in his or her professional life. It may appear though very dry, and in my opinion it should be, but it is the book to come back to again and again, time after time, year after year, so if you need a single specific recipe, you will not feel the book is very short thanks to the way it is structured.

I happen to actually use this book to cope with several assignments at work involving some medium to high complexity data processing for reporting purposes, thus more than a few recipes were used.

Namely, these were "Strings and Text" Ch. 2, "Numbers, Dates and Times" Ch. 3, and "Files and I/O" Ch. 4; then I hopped to "Functions" Ch. 7, which was followed by "Parsing, Modifying and Rewriting XML" Ch. 6.6, and finally landed on "Integrating with a Relational Database" Ch. 6.8. I wish, though, that chapter 7 "Functions" would precede most others, because I think it belongs right after "Iterators and generators", which I needed to use as I expanded my program.

I must tell each did its magic, after all Python excels on processing text!

... ... ...

[Nov 07, 2017] Dive Into Python

Nov 07, 2017 | www.diveintopython.net

July 28, 2002

Dive Into Python is a free Python book for experienced programmers. It was originally hosted at DiveIntoPython.org, but the author has pulled down all copies. It is being mirrored here. You can read the book online, or download it in a variety of formats. It is also available in multiple languages.

This book is still being written. You can read the revision history to see what's new. Updated 20 May 2004. Email me if you'd like to see something changed/updated, or suggestions for this site.

Dive Into Python in your language

Translations are freely permitted as long as they are released under the GNU Free Documentation License. Dive Into Python has already been fully or partially translated into several languages. If you translate it into another language and would like to be listed here, just let me know .

Republish Dive Into Python

Want to mirror this web site? Publish this book on your corporate intranet? Distribute it on CD-ROM? Feel free. This book is published under the GNU Free Documentation License, which gives you enormous freedoms to modify and redistribute it in all its forms. If you're familiar with the GNU General Public License for software, you already understand these freedoms; the FDL is the GPL for books. You can read the license for all the details.

Copyright © 2000, 2001, 2002, 2003, 2004 Mark Pilgrim

[Nov 07, 2017] pdb the Python Debugger

Sep 03, 2017 | docs.python.org
26.2. pdb — The Python Debugger

Source code: Lib/pdb.py


The module pdb defines an interactive source code debugger for Python programs. It supports setting (conditional) breakpoints and single stepping at the source line level, inspection of stack frames, source code listing, and evaluation of arbitrary Python code in the context of any stack frame. It also supports post-mortem debugging and can be called under program control.

The debugger is extensible: it is actually defined as the class Pdb . This is currently undocumented but easily understood by reading the source. The extension interface uses the modules bdb and cmd .

The debugger's prompt is (Pdb) . Typical usage to run a program under control of the debugger is:

>>> import pdb
>>> import mymodule
>>> pdb.run('mymodule.test()')
> <string>(0)?()
(Pdb) continue
> <string>(1)?()
(Pdb) continue
NameError: 'spam'
> <string>(1)?()
(Pdb)

pdb.py can also be invoked as a script to debug other scripts. For example:

python -m pdb myscript.py

When invoked as a script, pdb will automatically enter post-mortem debugging if the program being debugged exits abnormally. After post-mortem debugging (or after normal exit of the program), pdb will restart the program. Automatic restarting preserves pdb's state (such as breakpoints) and in most cases is more useful than quitting the debugger upon program's exit.

New in version 2.4: Restarting post-mortem behavior added.

The typical usage to break into the debugger from a running program is to insert

import pdb; pdb.set_trace()

at the location you want to break into the debugger. You can then step through the code following this statement, and continue running without the debugger using the continue command.

The typical usage to inspect a crashed program is:

>>> import pdb
>>> import mymodule
>>> mymodule.test()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "./mymodule.py", line 4, in test
    test2()
  File "./mymodule.py", line 3, in test2
    print spam
NameError: spam
>>> pdb.pm()
> ./mymodule.py(3)test2()
-> print spam
(Pdb)

The module defines the following functions; each enters the debugger in a slightly different way:

pdb.run(statement[, globals[, locals]])
Execute the statement (given as a string) under debugger control. The debugger prompt appears before any code is executed; you can set breakpoints and type continue , or you can step through the statement using step or next (all these commands are explained below). The optional globals and locals arguments specify the environment in which the code is executed; by default the dictionary of the module __main__ is used. (See the explanation of the exec statement or the eval() built-in function.)
pdb.runeval(expression[, globals[, locals]])
Evaluate the expression (given as a string) under debugger control. When runeval() returns, it returns the value of the expression. Otherwise this function is similar to run() .
pdb.runcall(function[, argument, ...])
Call the function (a function or method object, not a string) with the given arguments. When runcall() returns, it returns whatever the function call returned. The debugger prompt appears as soon as the function is entered.
pdb.set_trace()
Enter the debugger at the calling stack frame. This is useful to hard-code a breakpoint at a given point in a program, even if the code is not otherwise being debugged (e.g. when an assertion fails).
pdb.post_mortem([traceback])
Enter post-mortem debugging of the given traceback object. If no traceback is given, it uses the one of the exception that is currently being handled (an exception must be being handled if the default is to be used).
pdb.pm()
Enter post-mortem debugging of the traceback found in sys.last_traceback.

The run* functions and set_trace() are aliases for instantiating the Pdb class and calling the method of the same name. If you want to access further features, you have to do this yourself:

class pdb.Pdb(completekey='tab', stdin=None, stdout=None, skip=None)
Pdb is the debugger class.

The completekey , stdin and stdout arguments are passed to the underlying cmd.Cmd class; see the description there.

The skip argument, if given, must be an iterable of glob-style module name patterns. The debugger will not step into frames that originate in a module that matches one of these patterns. [1]

Example call to enable tracing with skip :

import pdb; pdb.Pdb(skip=['django.*']).set_trace()

New in version 2.7: The skip argument.
run(statement[, globals[, locals]])
runeval(expression[, globals[, locals]])
runcall(function[, argument, ...])
set_trace()
See the documentation for the functions explained above.
26.3. Debugger Commands

The debugger recognizes the following commands. Most commands can be abbreviated to one or two letters; e.g. h(elp) means that either h or help can be used to enter the help command (but not he or hel , nor H or Help or HELP ). Arguments to commands must be separated by whitespace (spaces or tabs). Optional arguments are enclosed in square brackets ( [] ) in the command syntax; the square brackets must not be typed. Alternatives in the command syntax are separated by a vertical bar ( | ).

Entering a blank line repeats the last command entered. Exception: if the last command was a list command, the next 11 lines are listed.

Commands that the debugger doesn't recognize are assumed to be Python statements and are executed in the context of the program being debugged. Python statements can also be prefixed with an exclamation point ( ! ). This is a powerful way to inspect the program being debugged; it is even possible to change a variable or call a function. When an exception occurs in such a statement, the exception name is printed but the debugger's state is not changed.

Multiple commands may be entered on a single line, separated by ;; . (A single ; is not used as it is the separator for multiple commands in a line that is passed to the Python parser.) No intelligence is applied to separating the commands; the input is split at the first ;; pair, even if it is in the middle of a quoted string.

The debugger supports aliases. Aliases can have parameters which allows one a certain level of adaptability to the context under examination.

If a file .pdbrc exists in the user's home directory or in the current directory, it is read in and executed as if it had been typed at the debugger prompt. This is particularly useful for aliases. If both files exist, the one in the home directory is read first and aliases defined there can be overridden by the local file.

h(elp) [ command ]
Without argument, print the list of available commands. With a command as argument, print help about that command. help pdb displays the full documentation file; if the environment variable PAGER is defined, the file is piped through that command instead. Since the command argument must be an identifier, help exec must be entered to get help on the ! command.
w(here)
Print a stack trace, with the most recent frame at the bottom. An arrow indicates the current frame, which determines the context of most commands.
d(own)
Move the current frame one level down in the stack trace (to a newer frame).
u(p)
Move the current frame one level up in the stack trace (to an older frame).
b(reak) [[ filename :] lineno | function [, condition ]]

With a lineno argument, set a break there in the current file. With a function argument, set a break at the first executable statement within that function. The line number may be prefixed with a filename and a colon, to specify a breakpoint in another file (probably one that hasn't been loaded yet). The file is searched on sys.path . Note that each breakpoint is assigned a number to which all the other breakpoint commands refer.

If a second argument is present, it is an expression which must evaluate to true before the breakpoint is honored.

Without argument, list all breaks, including for each breakpoint, the number of times that breakpoint has been hit, the current ignore count, and the associated condition if any.

tbreak [[ filename :] lineno | function [, condition ]]
Temporary breakpoint, which is removed automatically when it is first hit. The arguments are the same as break.
cl(ear) [ filename:lineno | bpnumber [ bpnumber ]]
With a filename:lineno argument, clear all the breakpoints at this line. With a space separated list of breakpoint numbers, clear those breakpoints. Without argument, clear all breaks (but first ask confirmation).
disable [ bpnumber [ bpnumber ]]
Disables the breakpoints given as a space separated list of breakpoint numbers. Disabling a breakpoint means it cannot cause the program to stop execution, but unlike clearing a breakpoint, it remains in the list of breakpoints and can be (re-)enabled.
enable [ bpnumber [ bpnumber ]]
Enables the breakpoints specified.
ignore bpnumber [ count ]
Sets the ignore count for the given breakpoint number. If count is omitted, the ignore count is set to 0. A breakpoint becomes active when the ignore count is zero. When non-zero, the count is decremented each time the breakpoint is reached and the breakpoint is not disabled and any associated condition evaluates to true.
condition bpnumber [ condition ]
Condition is an expression which must evaluate to true before the breakpoint is honored. If condition is absent, any existing condition is removed; i.e., the breakpoint is made unconditional.
commands [ bpnumber ]

Specify a list of commands for breakpoint number bpnumber . The commands themselves appear on the following lines. Type a line containing just 'end' to terminate the commands. An example:

(Pdb) commands 1

(com) print some_variable

(com) end

(Pdb)

To remove all commands from a breakpoint, type commands and follow it immediately with end; that is, give no commands.

With no bpnumber argument, commands refers to the last breakpoint set.

You can use breakpoint commands to start your program up again. Simply use the continue command, or step, or any other command that resumes execution.

Specifying any command resuming execution (currently continue, step, next, return, jump, quit and their abbreviations) terminates the command list (as if that command was immediately followed by end). This is because any time you resume execution (even with a simple next or step), you may encounter another breakpoint, which could have its own command list, leading to ambiguities about which list to execute.

If you use the 'silent' command in the command list, the usual message about stopping at a breakpoint is not printed. This may be desirable for breakpoints that are to print a specific message and then continue. If none of the other commands print anything, you see no sign that the breakpoint was reached.

New in version 2.5.
s(tep)
Execute the current line, stop at the first possible occasion (either in a function that is called or on the next line in the current function).
n(ext)
Continue execution until the next line in the current function is reached or it returns. (The difference between next and step is that step stops inside a called function, while next executes called functions at (nearly) full speed, only stopping at the next line in the current function.)
unt(il)

Continue execution until a line with a number greater than the current one is reached, or until the current frame returns.

New in version 2.6.
r(eturn)
Continue execution until the current function returns.
c(ont(inue))
Continue execution, only stop when a breakpoint is encountered.
j(ump) lineno

Set the next line that will be executed. Only available in the bottom-most frame. This lets you jump back and execute code again, or jump forward to skip code that you don't want to run.

It should be noted that not all jumps are allowed; for instance, it is not possible to jump into the middle of a for loop or out of a finally clause.

l(ist) [ first [, last ]]
List source code for the current file. Without arguments, list 11 lines around the current line or continue the previous listing. With one argument, list 11 lines around at that line. With two arguments, list the given range; if the second argument is less than the first, it is interpreted as a count.
a(rgs)
Print the argument list of the current function.
p expression

Evaluate the expression in the current context and print its value.

Note

print can also be used, but is not a debugger command; this executes the Python print statement.

pp expression
Like the p command, except the value of the expression is pretty-printed using the pprint module.
alias [ name [command]]

Creates an alias called name that executes command . The command must not be enclosed in quotes. Replaceable parameters can be indicated by %1 , %2 , and so on, while %* is replaced by all the parameters. If no command is given, the current alias for name is shown. If no arguments are given, all aliases are listed.

Aliases may be nested and can contain anything that can be legally typed at the pdb prompt. Note that internal pdb commands can be overridden by aliases. Such a command is then hidden until the alias is removed. Aliasing is recursively applied to the first word of the command line; all other words in the line are left alone.

As an example, here are two useful aliases (especially when placed in the .pdbrc file):

#Print instance variables (usage "pi classInst")

alias pi for k in %1.__dict__.keys(): print "%1.",k,"=",%1.__dict__[k]

#Print instance variables in self

alias ps pi self

unalias name
Deletes the specified alias.
[!] statement

Execute the (one-line) statement in the context of the current stack frame. The exclamation point can be omitted unless the first word of the statement resembles a debugger command. To set a global variable, you can prefix the assignment command with a global command on the same line, e.g.:

(Pdb) global list_options; list_options = ['-l']

(Pdb)

run [ args ]

Restart the debugged Python program. If an argument is supplied, it is split with "shlex" and the result is used as the new sys.argv. History, breakpoints, actions and debugger options are preserved. "restart" is an alias for "run".

New in version 2.6.
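The "split with shlex" step means the argument string is tokenized with shell-like quoting rules before it becomes the new sys.argv; for example:

```python
import shlex

# Shell-like splitting: quotes group words, whitespace separates arguments.
args = shlex.split("--verbose 'my file.txt' -n 3")
print(args)  # ['--verbose', 'my file.txt', '-n', '3']
```

So quoted arguments containing spaces survive the restart as single sys.argv entries.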
q(uit)
Quit from the debugger. The program being executed is aborted.

Footnotes

[1] Whether a frame is considered to originate in a certain module is determined by the __name__ in the frame globals.

[Nov 06, 2017] Dive Deep Into Python Vs Perl Debate - What Should I Learn Python or Perl

Nov 06, 2017 | www.tecmint.com

2. Perl's Built-in Vs Python's 3rd Party Regex and OS Operations Support

The Perl language borrows its syntax from C and UNIX tools like sed, awk, etc., due to which it has much more powerful, built-in regex support without importing any third-party modules.

Also, Perl can handle OS operations using built-in functions. Python, on the other hand, relies on separate standard-library modules for both, i.e. re for regex and os, sys for OS operations, which must be imported before performing such operations.

Perl's regex operations have a 'sed'-like syntax, which makes not only searching easy; replacement, substitution and other string operations can be done more easily and swiftly than in Python, where one needs to know and remember the functions that cater to each need.

Example: Consider a program to search for digit in the string in Perl and Python

Python
import re
s = 'hello0909there'
result = re.findall(r'\d+', s)
print(result)
Perl
$string =  'hello0909there';
$string =~ m/(\d+)/;
print "$& \n"

You can see that Perl's syntax is more concise, inspired by the sed command, which gives it an advantage over Python's, where the third-party-feeling 're' module must be imported first.
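The comparison above covers searching only; for substitution, Python's rough equivalent of Perl's s/// operator is re.sub(). A quick sketch:

```python
import re

s = "hello0909there"
print(re.findall(r"\d+", s))   # search:  ['0909']
print(re.sub(r"\d+", "-", s))  # replace: 'hello-there'  (Perl: $s =~ s/\d+/-/)
```

Both the pattern and the replacement are ordinary arguments, so there is no separate substitution syntax to learn beyond the function itself.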

  1. Dominix says: September 26, 2016 at 1:52 pm

    Python centric bullshit

  2. J Cleaver says: September 13, 2016 at 4:40 am

    Some of these perl examples don't really reflect what's more-or-less standard in the Perl community at any time since Perl 5 came out (15 years ago).

    Keeping in mind the vision of TMTOWTDI, your second Perl example:

open(FILE,"<inp.txt") or die "Can't open file";
while(<FILE>) {
print "$_"; }

    really would be typically written as just:

open (FILE, "inp.txt") or die "Can't open file: $!";
print while (<FILE>);

    As many others have pointed out, Perl has a huge amount of syntax flexibility despite its overtones of C heritage, and that allows people to write working code in a relatively ugly, inefficient, and/or hard-to-read manner, with syntax reflecting their experience with other languages.

    It's not really a drawback that Perl is so expressive, but it does mean that the programmer should be as disciplined as the task warrants when writing it when it comes to understandable Perl idioms.

  1. David G. Miller says: September 12, 2016 at 3:04 am

    1) I've usually found that the clarity and elegance of a program have a lot more to do with the programmer than the programming language. People who develop clean solutions will do so regardless of the language of implementation. Likewise, those who can't program will find a way to force an ugly solution out of any language.

    2) Most systems administrators aren't programmers and have rarely had any formal training in software development.

Put these two observations together and you will still get ugly, "write only" programs. Before perl it was shell script, yesterday it was perl, today it's Python. Tomorrow someone will be asking for a replacement for Python because it's so hard to read and can't be maintained. Get used to it (but don't blame the programming language).

    I started my perl programming with perl 2.0 in 1993. It's still my "go to" programming language since it doesn't get in my way and I can get to a solution much faster than with C or shell script.

    • Joe Chakra says: September 12, 2016 at 3:18 am

Actually, performance does matter even for scripting. Imagine filtering a 100 MB debug log. You could use awk or gawk, sed or grep, but Perl gives a lot more flexibility. Taking five seconds is very different from taking ten seconds, because the more time between request and response, the more likely you are to get distracted.

  1. D. B. Dweeb says: September 8, 2016 at 1:30 am

    The Pythonic file handling below surpasses the Perl example, the exception text and file close is automatic. Advantage Python!

    with open("data.csv") as f:
        for line in f:
            print line,

[Nov 06, 2017] Indentation Error in Python - Stack Overflow

Nov 06, 2017 | stackoverflow.com
I can't compile because of this part in my code:
    if command == 'HOWMANY':
        opcodegroupr = "A0"
        opcoder = "85"
    elif command == 'IDENTIFY':
        opcodegroupr = "A0"
        opcoder = "81"

I have this error:

Sorry: IndentationError: ('unindent does not match any outer indentation level', ('wsn.py', 1016, 30, "\t\telif command == 'IDENTIFY':\n"))

But I don't see any indentation error. What can be the problem?

Martijn Pieters ,Feb 20, 2013 at 11:54

You are mixing tabs and spaces.

Find the exact location with:

python -tt yourscript.py

and replace all tabs with spaces. You really want to configure your text editor to only insert spaces for tabs as well.

poke ,Feb 20, 2013 at 11:55

Or the other way around (depends on your personal preference) – poke Feb 20 '13 at 11:55

poke ,Feb 20, 2013 at 12:02

@MartijnPieters If you use tabs, you have tabs, so you do not need to care about its visual presentation. You should never mix tabs and spaces, but apart from that, just choose one and stick to it . You are right, it's a never-ending debate; it totally depends on your personal preference -- hence my comment. – poke Feb 20 '13 at 12:02

neil ,Feb 20, 2013 at 12:02

I have never understood why you would want to use spaces instead of tabs - 1 tab is 1 level of indent and then the size of that is a display preference - but it seems the world disagrees with me. – neil Feb 20 '13 at 12:02

Martijn Pieters ♦ ,Feb 20, 2013 at 12:13

@poke: That's very nice, but in any decent-sized project you will not be the only developer. As soon as you have two people together, there is a large chance you'll disagree about tab size. And pretending that noone will ever make the mistake of mixing tabs and spaces is sticking your head in the sand, frankly. There is a reason that every major style guide for OSS (python or otherwise) states you need to use spaces only . :-) – Martijn Pieters ♦ Feb 20 '13 at 12:13

geoffspear ,Feb 20, 2013 at 12:22

There should be one, and preferably only one, obvious way to do it. Following the style of the python codebase itself is obvious. – geoffspear Feb 20 '13 at 12:22

[Nov 06, 2017] Python Myths about Indentation

Nov 06, 2017 | www.secnetix.de

Python: Myths about Indentation

Note: Lines beginning with " >>> " and " ... " indicate input to Python (these are the default prompts of the interactive interpreter). Everything else is output from Python.

There are quite a few prejudices and myths about Python's indentation rules among people who don't really know Python. I'll try to address a few of these concerns on this page.


"Whitespace is significant in Python source code."

No, not in general. Only the indentation level of your statements is significant (i.e. the whitespace at the very left of your statements). Everywhere else, whitespace is not significant and can be used as you like, just like in any other language. You can also insert empty lines that contain nothing (or only arbitrary whitespace) anywhere.

Also, the exact amount of indentation doesn't matter at all, but only the relative indentation of nested blocks (relative to each other).

Furthermore, the indentation level is ignored when you use explicit or implicit continuation lines. For example, you can split a list across multiple lines, and the indentation is completely insignificant. So, if you want, you can do things like this:

>>> foo = [
... 'some string',
... 'another string',
... 'short string'
... ]
>>> print foo
['some string', 'another string', 'short string']

>>> bar = 'this is ' \
... 'one long string ' \
... 'that is split ' \
... 'across multiple lines'
>>> print bar
this is one long string that is split across multiple lines

"Python forces me to use a certain indentation style."

Yes and no. First of all, you can write the inner block all on one line if you like, therefore not having to care about indentation at all. The following three versions of an "if" statement are all valid and do exactly the same thing (output omitted for brevity):

>>> if 1 + 1 == 2:
...     print "foo"
...     print "bar"
...     x = 42

>>> if 1 + 1 == 2:
...     print "foo"; print "bar"; x = 42

>>> if 1 + 1 == 2: print "foo"; print "bar"; x = 42

Of course, most of the time you will want to write the blocks in separate lines (like the first version above), but sometimes you have a bunch of similar "if" statements which can be conveniently written on one line each.

If you decide to write the block on separate lines, then yes, Python forces you to obey its indentation rules, which simply means: The enclosed block (that's two "print" statements and one assignment in the above example) has to be indented more than the "if" statement itself. That's it. And frankly, would you really want to indent it in any other way? I don't think so.

So the conclusion is: Python forces you to use indentation that you would have used anyway, unless you wanted to obfuscate the structure of the program. In other words: Python does not allow to obfuscate the structure of a program by using bogus indentations. In my opinion, that's a very good thing.

Have you ever seen code like this in C or C++?

/* Warning: bogus C code! */

if (some condition)
        if (another condition)
                do_something(fancy);
else
        this_sucks(badluck);

Either the indentation is wrong, or the program is buggy, because an "else" always applies to the nearest "if", unless you use braces. This is an essential problem in C and C++. Of course, you could resort to always use braces, no matter what, but that's tiresome and bloats the source code, and it doesn't prevent you from accidentally obfuscating the code by still having the wrong indentation. (And that's just a very simple example. In practice, C code can be much more complex.)

In Python, the above problems can never occur, because indentation levels and logical block structure are always consistent. The program always does what you expect when you look at the indentation.
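Indeed, inconsistent indentation is rejected at compile time rather than silently changing the meaning; a small sketch:

```python
# The dedent to 4 spaces matches no enclosing block level (0 or 8),
# so Python refuses to compile this instead of guessing which block
# the line belongs to.
src = "if a:\n        b = 1\n    c = 2\n"
try:
    compile(src, "<example>", "exec")
    ok = False
except IndentationError:
    ok = True
print(ok)  # True
```

This is exactly the "unindent does not match any outer indentation level" error from the Stack Overflow question above, raised deliberately instead of producing a dangling-else-style ambiguity.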

Quoting the famous book writer Bruce Eckel:

Because blocks are denoted by indentation in Python, indentation is uniform in Python programs. And indentation is meaningful to us as readers. So because we have consistent code formatting, I can read somebody else's code and I'm not constantly tripping over, "Oh, I see. They're putting their curly braces here or there." I don't have to think about that.


"You cannot safely mix tabs and spaces in Python."

That's right, and you don't want that. To be exact, you cannot safely mix tabs and spaces in C either: While it doesn't make a difference to the compiler, it can make a big difference to humans looking at the code. If you move a piece of C source to an editor with different tabstops, it will all look wrong (and possibly behave differently than it looks at first sight). You can easily introduce well-hidden bugs in code that has been mangled that way. That's why mixing tabs and spaces in C isn't really "safe" either. Also see the "bogus C code" example above.

Therefore, it is generally a good idea not to mix tabs and spaces for indentation. If you use tabs only or spaces only, you're fine.

Furthermore, it can be a good idea to avoid tabs altogether, because the semantics of tabs are not very well-defined in the computer world, and they can be displayed completely differently on different types of systems and editors. Also, tabs often get destroyed or wrongly converted during copy&paste operations, or when a piece of source code is inserted into a web page or other kind of markup code.

Most good editors support transparent translation of tabs, automatic indent and dedent. That is, when you press the tab key, the editor will insert enough spaces (not actual tab characters!) to get you to the next position which is a multiple of eight (or four, or whatever you prefer), and some other key (usually Backspace) will get you back to the previous indentation level.

In other words, it's behaving like you would expect a tab key to do, but still maintaining portability by using spaces in the file only. This is convenient and safe.

Having said that -- If you know what you're doing, you can of course use tabs and spaces to your liking, and then use tools like "expand" (on UNIX machines, for example) before giving the source to others. If you use tab characters, Python assumes that tab stops are eight positions apart.


"I just don't like it."

That's perfectly OK; you're free to dislike it (and you're probably not alone). Granted, the fact that indentation is used to indicate the block structure might be regarded as uncommon and as requiring some getting used to, but it does have a lot of advantages, and you get used to it very quickly when you seriously start programming in Python.

Having said that, you can use keywords to indicate the end of a block (instead of indentation), such as " endif ". These are not really Python keywords, but there is a tool that comes with Python which converts code using "end" keywords to correct indentation and removes those keywords. It can be used as a pre-processor to the Python compiler. However, no real Python programmer uses it, of course.
[Update] It seems this tool has been removed from recent versions of Python. Probably because nobody really used it.


"How does the compiler parse the indentation?"

The parsing is well-defined and quite simple. Basically, changes to the indentation level are inserted as tokens into the token stream.

The lexical analyzer (tokenizer) uses a stack to store indentation levels. At the beginning, the stack contains just the value 0, which is the leftmost position. Whenever a nested block begins, the new indentation level is pushed on the stack, and an "INDENT" token is inserted into the token stream which is passed to the parser. There can never be more than one "INDENT" token in a row.

When a line is encountered with a smaller indentation level, values are popped from the stack until a value is on top which is equal to the new indentation level (if none is found, a syntax error occurs). For each value popped, a "DEDENT" token is generated. Obviously, there can be multiple "DEDENT" tokens in a row.

At the end of the source code, "DEDENT" tokens are generated for each indentation level left on the stack, until just the 0 is left.

Look at the following piece of sample code:

>>> if foo:
...     if bar:
...         x = 42
... else:
...   print foo
...
In the following table, you can see the tokens produced on the left, and the indentation stack on the right.
<if> <foo> <:> [0]
<INDENT> <if> <bar> <:> [0, 4]
<INDENT> <x> <=> <42> [0, 4, 8]
<DEDENT> <DEDENT> <else> <:> [0]
<INDENT> <print> <foo> [0, 2]
<DEDENT> [0]
Note that after the lexical analysis (before parsing starts), there is no whitespace left in the list of tokens (except possibly within string literals, of course). In other words, the indentation is handled by the lexer, not by the parser.

The parser then simply handles the "INDENT" and "DEDENT" tokens as block delimiters -- exactly like curly braces are handled by a C compiler.

The above example is intentionally simple. There are more things to it, such as continuation lines. They are well-defined, too, and you can read about them in the Python Language Reference if you're interested, which includes a complete formal grammar of the language.
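You can watch these tokens being generated with the standard tokenize module; a quick sketch of the example above (Python 3 syntax, so print is a function):

```python
import io
import tokenize

src = (
    "if foo:\n"
    "    if bar:\n"
    "        x = 42\n"
    "else:\n"
    "  print(foo)\n"
)
# The tokenizer emits INDENT/DEDENT tokens exactly as described:
# three nested-block entries and three matching exits (the last at EOF).
names = [tokenize.tok_name[tok.type]
         for tok in tokenize.generate_tokens(io.StringIO(src).readline)]
print(names.count("INDENT"), names.count("DEDENT"))  # 3 3
```

Note that tokenize works purely lexically, so it happily tokenizes the top-level "else" that a full parse would reject.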

[Nov 06, 2017] Modular Programming with Python by Erik Westra

Nov 06, 2017 | www.amazon.com
  • Paperback: 246 pages
  • Publisher: Packt Publishing - ebooks Account (May 26, 2016)
  • Language: English
  • ISBN-10: 1785884484
  • ISBN-13: 978-1785884481
  • Product Dimensions: 7.5 x 0.6 x 9.2 inches

Contents:

Modular Programming with Python

1. Introducing Modular Programming

2. Writing Your First Modular Program

3. Using Modules and Packages

4. Using Modules for Real-World Programming

5. Working with Module Patterns

6. Creating Reusable Modules

7. Advanced Module Techniques

8. Testing and Deploying Modules

9. Modular Programming as a Foundation for Good Programming Technique

By kievite on November 5, 2017

Great book on a very important topic. Highly recommended

Great book on a very important topic.

Python is a complex language with an even more complex environment, and its module system is a critical part of it. For example, the Python standard library is structured as a collection of modules. The author gives you an excellent overview of the Python module system, gives important recommendations about creating your own modules (and provides several examples, including a generator example in chapter 4), and warns about gotchas. The interplay between modules and namespaces covered in Chapter 7 is alone worth several times the price of the book. For example, few understand that the import statement adds the imported module or package to the current namespace, which may or may not be the global namespace. The author also covers the problem of "name masking" in this chapter.
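The point about import binding names in the current namespace is easy to demonstrate; a tiny sketch:

```python
import math

# The import statement binds the name "math" in the *current* namespace,
# which here happens to be the module-global one.
print("math" in globals())  # True

def f():
    import json  # bound only in f's local namespace
    return "json" in globals()

print(f())  # False: the import inside f did not touch the global namespace
```

The same mechanics explain "name masking": a local or imported name can shadow a module-level one without any warning.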

The ability to write a large script using your own modules is a very important skill that few books teach. Intro books on Python usually try to throw everything the language contains into the sink, creating problems for those who study the language, even when they already know some other programming language such as C++ or Perl. The ability not to cover some features of the language is usually completely absent in the authors of such books.

Most authors of Python books talk a lot about how great Python is, but never explain why. This book explains probably the most important feature of this scripting language, the one that makes it great (actually inherited from Modula-3). Also, most intro books suffer from excessive fascination with OO (thank God this fad is past its peak). This book does not.

Publishing books devoted to important topics has great value, as you have nowhere else to go to get the information they provide. But it is a very risky business. Of course, if you are diligent you can collect this information yourself by reading a dozen books and extracting and organizing the relevant parts. But that work is better reserved for personalities resembling the famous Sherlock Holmes, and it presupposes that you have plenty of time to do it. Meeting both of those conditions is usually pretty unrealistic.

So it takes a certain amount of courage to write a book devoted to a single specific feature of Python, and the author should be commended for that.

That's why I highly recommend this book for anybody who is trying to learn the language. It really allows you to understand the single most critical feature of the Python language.

The book contain 9 chapters. Here are the titles of those chapters:

1. Introducing Modular Programming
2. Writing Your First Modular Program
3. Using Modules and Packages
4. Using Modules for Real-World Programming
5. Working with Module Patterns
6. Creating Reusable Modules
7. Advanced Module Techniques
8. Testing and Deploying Modules
9. Modular Programming as a Foundation for Good Programming Technique

NOTE: In chapter 8 the author covers an unrelated but important topic: how to prepare your modules for publication and upload them to GitHub. GitHub has become very popular among Python programmers, and the earlier you learn about this possibility the better.

Chapter 8 also covers the important topic of installing Python packages. Unfortunately, the coverage is way too brief and does not cover gotchas that you might experience installing packages such as NumPy.

I would like to stress it again: currently the book has no competition in its level of coverage of this topic, probably the most important feature of the Python language.

[Nov 04, 2017] Which is the best book for learning python for absolute beginners on their own?

Nov 04, 2017 | www.quora.com

Robert Love Software Engineer at Google

Mark Lutz's Learning Python is a favorite of many. It is a good book for novice programmers. The new fifth edition is updated to both Python 2.7 and 3.3.

Aditi Sharma , i love coding Answered Jul 10 2016

Originally Answered: Which is the best book for learning Python from beginners to advanced level?

Instead of a book, I would advise you to start learning Python from CodesDope, which is a wonderful site for learning Python from the absolute beginning. Its content explains everything step by step, with typography that makes learning fun and much easier. It also provides a number of practice questions for each topic, so you can strengthen a topic by solving its questions right after reading it, without having to search elsewhere for practice material. Moreover, it has a discussion forum which is very responsive in resolving doubts.

Alex Forsyth, Computer science major at MIT, Answered Dec 28 2015. Originally Answered: What is the best way to learn to code? Specifically Python.

There are many good websites for learning the basics, but for going a bit deeper, I'd suggest MIT OCW 6.00SC. This is how I learned Python back in 2012 and what ultimately led me to MIT and to major in CS. 6.00 teaches Python syntax but also teaches some basic computer science concepts. There are lectures from John Guttag, which are generally well done and easy to follow. It also provides access to some of the assignments from that semester, which I found extremely useful in actually learning Python.

After completing that, you'd probably have a better idea of what direction you wanted to go. Some examples could be completing further OCW courses or completing projects in Python.

[Sep 18, 2017] Operators and String Formatting in Python Operators

Sep 18, 2017 | www.informit.com

InformIT

Formatting Strings: Modulus

Although not actually modulus, the Python % operator works similarly in string formatting to interpolate variables into a formatting string. If you've programmed in C, you'll notice that % is much like C's printf(), sprintf(), and fprintf() functions.

There are two forms of %, one of which works with strings and tuples, the other with dictionaries.

StringOperand % TupleOperand 

StringOperand % DictionaryOperand

Both return a new formatted string quickly and easily.
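The dictionary form, shown briefly here for completeness, looks values up by key using %(key)s-style directives:

```python
# Dictionary operand: each %(key)s directive pulls the value for that key.
info = {"name": "Ross", "age": 28}
s = "%(name)s is %(age)d years old" % info
print(s)  # Ross is 28 years old
```

This form is handy when the same value appears several times in the format string, or when the values come from a mapping anyway.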

% Tuple String Formatting

In the StringOperand % TupleOperand form, StringOperand represents special directives within the string that help format the tuple. One such directive is %s, which sets up the format string

>>> format = "%s is my friend and %s is %s years old"

and creates two tuples, Ross_Info and Rachael_Info.

>>> Ross_Info = ("Ross", "he", 28)

>>> Rachael_Info = ("Rachael", "she", 28)

The format string operator (%) can be used within a print statement, where you can see that every occurrence of %s is respectively replaced by the items in the tuple.

>>> print (format % Ross_Info) 

Ross is my friend and he is 28 years old 



>>> print (format % Rachael_Info) 

Rachael is my friend and she is 28 years old

Also note that %s automatically converts the last item in the tuple to a reasonable string representation. Here's an example of how it does this using a list:

>>> bowling_scores = [190, 135, 110, 95, 195]

>>> name = "Ross"

>>> strScores = "%s's bowling scores were %s" \

...     % (name, bowling_scores) 

>>> print strScores 

Ross's bowling scores were [190, 135, 110, 95, 195]

First, we create a list variable called bowling_scores and then a string variable called name. We then use a string literal for a format string (StringOperand) and use a tuple containing name and bowling_scores.
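A detail worth knowing: %s converts its value with str(), so the result matches an explicit conversion. A quick Python 3 check:

```python
bowling_scores = [190, 135, 110, 95, 195]

# %s is equivalent to wrapping the value in str();
# the trailing comma makes a one-item tuple so the list
# is treated as a single value, not an argument sequence
assert "%s" % (bowling_scores,) == str(bowling_scores)
print("scores rendered: %s" % (bowling_scores,))
```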

Format Directives

Table 3–6 covers all of the format directives and provides a short example of usage for each. Note that the tuple argument containing a single item can be denoted with the % operator as item, or (item).
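One caveat with the bare-item form: if the single value is itself a tuple, it must be wrapped in a one-item tuple, or % will treat it as a sequence of arguments. A short Python 3 session illustrating this:

```python
# A single value may be supplied bare or inside a tuple
assert "%d" % 5 == "5"
assert "%d" % (5,) == "5"

# But a value that is itself a tuple MUST be wrapped,
# otherwise % unpacks it as separate arguments
point = (3, 4)
assert "%s" % (point,) == "(3, 4)"
print("single-item examples hold")
```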

Table 3–6 Format Directives
Directive Description Interactive Session
%s Represents a value as a string >>> list = ["hi", 1, 1.0, 1L]
>>> "%s" % list
"['hi', 1, 1.0, 1L]"
>>> "list equals %s" % list
"list equals ['hi', 1, 1.0, 1L]"
%i Integer >>> "i = %i" % (5)
'i = 5'
>>> "i = %3i" % (5)
'i =   5'
%d Decimal integer >>> "d = %d" % 5
'd = 5'
>>> "%3d" % (3)
'  3'
%x Hexadecimal integer >>> "%x" % (0xff)
'ff'
>>> "%x" % (255)
'ff'
%o Octal integer >>> "%o" % (255)
'377'
>>> "%o" % (0377)
'377'
%u Unsigned integer >>> print "%u" % -2000
2147481648
>>> print "%u" % 2000
2000
%e Float exponent >>> print "%e" % (30000000L)
3.000000e+007
>>> "%5.2e" % (300000000L)
'3.00e+008'
%f Float >>> "check = %1.2f" % (3000)
'check = 3000.00'
>>> "payment = $%1.2f" % 3000
'payment = $3000.00'
%g Float, general format >>> "%3.3g" % 100
'100'
>>> "%3.3g" % 1000000000000L
'1e+12'
>>> "%g" % 100
'100'
%c ASCII character >>> "%c" % (97)
'a'
>>> "%c" % 97
'a'

Table 3–7 shows how flags can be used with the format directives to add leading zeroes or spaces to a formatted number. They should be inserted immediately after the %.

Table 3–7 Format Directive Flags
Flag Description Interactive Session
# Forces octal to have a 0 prefix; forces hex to have a 0x prefix >>> "%#x" % 0xff
'0xff'
>>> "%#o" % 0377
'0377'
+ Forces a positive number to have a sign >>> "%+d" % 100
'+100'
- Left justification (default is right) >>> "%-5d, %-5d" % (10,10)
'10   , 10   '
" " Precedes a positive number with a blank space >>> "% d,% d" % (-10, 10)
'-10, 10'
0 0 padding instead of spaces >>> "%05d" % (100,)
'00100'
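The flag behavior is easy to verify; the session below is written for Python 3, where octal literals are spelled 0o377 and the # flag produces a 0o prefix for octal (Python 2 produced a bare leading 0):

```python
assert "%#x" % 255 == "0xff"     # '#' adds the 0x prefix
assert "%#o" % 255 == "0o377"    # Python 3 spelling of the octal prefix
assert "%+d" % 100 == "+100"     # '+' forces an explicit sign
assert "%-5d|" % 10 == "10   |"  # '-' left-justifies within the field
assert "% d" % 10 == " 10"       # ' ' reserves a space for the sign
assert "%05d" % 100 == "00100"   # '0' pads with zeroes instead of spaces
print("all flag examples hold")
```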

Advanced Topic: Using the %d, %i, %f, and %e Directives for Formatting Numbers

The % directives format numeric types: %i works with integers; %f and %e work with floats, without and with scientific notation, respectively.

>>> "%i, %f, %e" % (1000, 1000, 1000) 

'1000, 1000.000000, 1.000000e+003'

Notice how awkward all of those zeroes look. You can limit the length of precision and neaten up your code like this:

>>> "%i, %2.2f, %2.2e" % (1000, 1000, 1000) 

'1000, 1000.00, 1.00e+003'

The %2.2f directive tells Python to format the number as at least two characters wide and to cut the precision to two digits after the decimal point. This is useful for printing floating-point numbers that represent currency.

>>> "Your monthly payments are $%1.2f" % (payment) 

'Your monthly payments are $444.43'

All % directives have the form %min.precision(type), where min is the minimum width of the field, precision is the number of digits after the decimal point, and type is the type of directive (e, f, i, or d). If the precision field is missing, the directive can take the form %min(type); for example, %5d ensures that a decimal number is at least 5 characters wide and %20f ensures that a floating-point number is at least 20.

Let's look at the use of these directives in an interactive session.

>>> "%5d" % (100,) 

'  100'

>>> "%20f" % (100,) 

'          100.000000'

Here's how to limit the float to 2 digits after the decimal point with %20.2f.

>>> "%20.2f" % (100,) 

'              100.00'
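The precision field can also stand alone, without a minimum width. A quick Python 3 check (modern Python prints two-digit exponents, so the outputs differ slightly from the older sessions above):

```python
# Precision without an explicit width
assert "%.1f" % 2.718 == "2.7"
assert "%.3e" % 1000 == "1.000e+03"

# Width and precision together
assert "%8.2f" % 3.14159 == "    3.14"
print("precision examples hold")
```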

The padding that precedes the directive is useful for printing rows and columns of data for reporting because it makes the printed output easy to read. This can be seen in the following example (from format.py):

     # Create two rows

row1 = (100, 10000, 20000, 50000, 6000, 6, 5) 

row2 = (1.0, 2L, 5, 2000, 56, 6.0, 7) 



      # 

      # Print out the rows without formatting 

print "here is an example of the columns not lining up" 

print `row1` + "\n" + `row2` 

print 

      # 

      # Create a format string that forces the number

      # to be at least 3 characters long to the left

      # and 2 characters to the right of the decimal point

format = "(%3.2e, %3.2e, %3.2e, %3.2e, " + \
         "%3.2e, %3.2e, %3.2e)" 



      # 

      # Create a string for both rows

      # using the format operator

strRow1 = format % row1 

strRow2 = format % row2 

print "here is an example of the columns" + \ 

        " lining up using %e" 



print strRow1 + "\n" + strRow2 

print 



      # Do it again this time with the %i and %d directive 

format1 = "(%6i, %6i, %6i, %6i, %6i, %6i, %6i)" 

format2 = "(%6d, %6d, %6d, %6d, %6d, %6d, %6d)" 

strRow1 = format1 % row1 

strRow2 = format2 % row2 

print "here is an example of the columns" + \ 

        " lining up using %i and %d" 



print strRow1 + "\n" + strRow2 

print 



here is an example of the columns not lining up 

(100, 10000, 20000, 50000, 6000, 6, 5) 

(1.0, 2L, 5, 2000, 56, 6.0, 7) 



here is an example of the columns lining up using %e 

(1.00e+002, 1.00e+004, 2.00e+004, 5.00e+004, 6.00e+003, 6.00e+000, 5.00e+000) 

(1.00e+000, 2.00e+000, 5.00e+000, 2.00e+003, 5.60e+001, 6.00e+000, 7.00e+000) 



here is an example of the columns lining up using %i and %d 

(   100,  10000,  20000,  50000,   6000,      6,      5) 

(     1,      2,      5,   2000,     56,      6,      7)

You can see that the %3.2e directive permits a number to take up only three spaces plus the exponent, whereas %6d and %6i permit at least six spaces. Note that %i and %d do the same thing as each other. Most C programmers are familiar with %d but may not be familiar with %i, which is a more recent addition to that language.
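For reference, a Python 3 rendering of the same report sketch (print is a function, long literals lose the L suffix, and repr() replaces the old backtick syntax):

```python
row1 = (100, 10000, 20000, 50000, 6000, 6, 5)
row2 = (1.0, 2, 5, 2000, 56, 6.0, 7)

# Unformatted: the columns do not line up
print(repr(row1) + "\n" + repr(row2))

# A fixed-width %9.2e directive aligns every column
fmt_e = "(%9.2e, %9.2e, %9.2e, %9.2e, %9.2e, %9.2e, %9.2e)"
print(fmt_e % row1)
print(fmt_e % row2)

# %6d does the same for integer columns (floats are truncated)
fmt_d = "(%6d, %6d, %6d, %6d, %6d, %6d, %6d)"
print(fmt_d % row1)
print(fmt_d % row2)
```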

String % Dictionary

Another useful Python feature for formatting strings is StringOperand % DictionaryOperand. This form allows you to customize and print named fields in the string. %(Income)d formats the value referenced by the Income key. Say, for example, that you have a dictionary like the one here:

Monica = { 

                 "Occupation": "Chef",

                 "Name" : "Monica", 

                 "Dating" : "Chandler",

                 "Income" : 40000 

                  }

With %(Income)d, this is expressed as

>>> "%(Income)d" % Monica 

'40000'
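Two properties of the dictionary form are worth noting: the dictionary may carry more keys than the format string references, and a key may be referenced more than once. A quick Python 3 check:

```python
Monica = {"Occupation": "Chef", "Name": "Monica",
          "Dating": "Chandler", "Income": 40000}

# Extra keys (Dating, Income) are simply ignored,
# and Name is interpolated twice
line = "%(Name)s the %(Occupation)s, again %(Name)s" % Monica
print(line)  # Monica the Chef, again Monica
```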

Now let's say you have three best friends, whom you define as dictionaries named Monica, Chandler, and Ross.

Monica = { 

                 "Occupation": "Chef",

                 "Name" : "Monica", 

                 "Dating" : "Chandler", 

                 "Income" : 40000 

                 } 



Ross = { 

                "Occupation": "Scientist Museum Dude",

                "Name" : "Ross", 

                "Dating" : "Rachael", 

                "Income" : 70000 

                } 



Chandler = { 

                "Occupation": "Buyer",

                "Name" : "Chandler", 

                "Dating" : "Monica", 

                "Income" : 65000 

                }

To write them a form letter, you can create a format string called message that uses all of the above dictionaries' keywords.

message = "%(Name)s, %(Occupation)s, %(Dating)s," \ 

                  " %(Income)2.2f"

Notice that %(Income)2.2f formats the income with two digits after the decimal point, which is good for currency. Applying the format string to the Chandler dictionary, the output is

>>> print message % Chandler 

Chandler, Buyer, Monica, 65000.00
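Applied in a loop, one format string turns each dictionary into a line of the letter. A Python 3 sketch using the same data as above:

```python
friends = [
    {"Name": "Monica", "Occupation": "Chef",
     "Dating": "Chandler", "Income": 40000},
    {"Name": "Ross", "Occupation": "Scientist Museum Dude",
     "Dating": "Rachael", "Income": 70000},
    {"Name": "Chandler", "Occupation": "Buyer",
     "Dating": "Monica", "Income": 65000},
]

message = "%(Name)s, %(Occupation)s, %(Dating)s, %(Income)2.2f"
for friend in friends:
    print(message % friend)
# Last line printed: Chandler, Buyer, Monica, 65000.00
```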