Softpanorama


Python -- Scripting language with generators and coroutines

Python for Perl programmers

A new competitor seemed to emerge out of the woodwork every month or so. The first thing I would do, after checking to see if they had a live online demo, was look at their job listings. After a couple years of this I could tell which companies to worry about and which not to. The more of an IT flavor the job descriptions had, the less dangerous the company was.
  • The safest kind were the ones that wanted Oracle experience. You never had to worry about those.
    You were also safe if they said they wanted C++ or Java developers.
  • If they wanted Perl or Python programmers, that would be a bit frightening-- that's starting to sound like a company where the technical side, at least, is run by real hackers.
  • If I had ever seen a job posting looking for Lisp hackers, I would have been really worried.

-- Paul Graham, co-founder of Viaweb


Introduction

This is the fourth page of an ongoing series of pages covering major scripting language topics for Unix/Linux system administrators (others cover Unix shells, Perl, and TCL), based on Professor Bezroukov's lectures.

Python is now becoming a language that every Unix/Linux sysadmin must know at least at a superficial level, as for many users, especially researchers, it is the primary language either for development or for writing supporting scripts. Python has been hailed as a really easy-to-learn language. That is not completely true; in fact it is as complex a language as Perl, with its own set of warts. But it is an easier language to start with, and that's why it got a strong foothold at universities, displacing Java (repeating the path of Pascal in this regard, but on a new level). The rest is history.

As most sysadmins know Perl, Python should be easy to learn at a basic level: the key concepts are similar, and Python is a language greatly influenced by Perl, incorporating some European and, more specifically, Niklaus Wirth ideas about programming languages (of Pascal, Modula and Modula-2 fame). Python's core syntax and certain aspects of its philosophy were also inherited from ABC: a language for teaching programming developed at CWI in Amsterdam, where Guido van Rossum worked before creating Python. Python does contain quite a few annoyances and strange ad hoc rules, especially in its OO and lambda features, but the core language is mostly OK.

Python definitely avoided several traps into which Larry Wall fell, such as forced conversion based on the type of operation, C-style curly brackets with their chronic problem of the unclosed (runaway) bracket, and the obligatory semicolon at the end of each statement (inherited from C, which in turn inherited it from PL/1). It also does not use sigils, which makes a Python program superficially look "more normal." In reality sigils are one of the lesser Perl problems, and people usually adapt to them well ;-). Forced conversion based on the type of operation, abuse of both round and curly brackets, the mandatory semicolon at the end of statements, and the attempt to match Python in OO (the last refuge of Perl complexity junkies) are much more serious problems.

But Python fell into the C++ trap: too much of the functionality of the language is offloaded to the standard library. An especially pitiful example is the regular expression engine. The level of integration with Unix is lower in Python than in Perl, which was designed as a superset of shell, AWK and sed.

While a beginner can learn the basics of the language rather quickly (and that's why it was adopted by universities as the first language), in reality Python is a complex language which is pretty difficult to master. Its semantics differs from Perl enough to make it painful and annoying to learn for accomplished Perl programmers, as Perl skills and idioms interfere with Python's. You feel like an accomplished ballroom dancer put on an ice rink: you need to re-learn a lot of things, and fall a lot of times. Moreover, all this hype about Python being easier to read and understand is just that: programming language hype. It reflects one thing: the inability to distinguish "programming in the small" from "programming in the large". I think Philip J. Guo put it well (Philip Guo - Hey, your Python code is unreadable!):

I argue that code written in Python is not necessarily any more easily readable and comprehensible than code written in other contemporary languages, even though one of Python's major selling points is its concise, clear, and consistent syntax. While Python code might be easier to comprehend 'in-the-small', it is not usually any easier to comprehend 'in-the-large'.

That's the key: Python code is usually more difficult to comprehend in the large than Perl code because it is more verbose; the same task requires significantly more lines of code than in Perl. In comparison with Perl, Python has a lower language level, requiring more lines of code (and tokens) to express the same algorithm. When OO is abused, the difference is frequently over 50%.

And the number of errors and the difficulty of comprehending a program are directly proportional to the number of lines of code, no matter what language you use. So Perl's obsession with expressiveness, "Huffman coding" of some language constructs (with frequently used constructs having special notation), and compact coding has value in this respect.

Moreover, Python programs often suffer from abuse of OO (sometimes to a horrible level), leading to OO spaghetti and programs several times more complex than they should be.

Python programs can be decomposed into modules, statements, expressions, and objects, as follows:

  1. Programs are composed of modules.
  2. Modules contain statements.
  3. Statements contain expressions.
  4. Expressions create and process objects.
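A minimal sketch of that decomposition (the file itself would be the module; the names are illustrative, not from the original text):

```python
# This file, saved as e.g. stats.py, would be a module.
names = ["alice", "bob"]       # a statement containing an expression
doubled = len(names) * 2       # the expression creates and processes objects
print(doubled)                 # prints 4
```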

The first versions of Python did not introduce any new ideas and were by and large just a cleaner rehash of Perl's ideas, with the addition of the idea of modules from Modula-3. It was released in 1991, the same year as Linux. Wikipedia has a short article about Python history: History of Python - Wikipedia, the free encyclopedia

But starting with version 2.2 it added support for semi-coroutines in a platform-independent way (via generators, a concept inspired by Icon) and became a class of its own. I think this makes Python in certain areas a better scripting language than the other members of the "most popular troika" (Perl, PHP and JavaScript). And it is definitely a more popular language than Ruby. The availability of semi-coroutines favorably distinguishes Python from Perl.
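A minimal sketch of a generator used as a semi-coroutine: execution suspends at each yield and resumes where it left off on the next call.

```python
def countdown(n):
    """Generator: a semi-coroutine that suspends at each yield."""
    while n > 0:
        yield n    # suspend here, handing n back to the caller
        n -= 1     # resume here on the next next() call

gen = countdown(3)
print(next(gen))   # 3
print(next(gen))   # 2
print(list(gen))   # [1]  -- the remaining values
```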

Python's popularity to a large extent stems from its attractiveness to novices (the "Peter Naur effect"). So in a sense Python replayed PHP's success on a new level. Python was also lucky that for some time it enjoyed the full support of Google (which for a long time employed Python's creator, Guido van Rossum). In addition, Microsoft also supported it indirectly (via IronPython and support in Visual Studio and other tools like Expression Web). That created some money flow for development and put Python in a much better position than Perl, which after it lost O'Reilly sponsorship (despite the fact that O'Reilly as a publisher earned millions on Perl books) does not have a powerful corporate sponsor, and its development is lingering in obscurity.

Even in 2017 Python still enjoys some level of support from Google, and there is no similar sponsor for either Ruby, R, or Perl. Out of modern languages only JavaScript survived a semi-abandoned status (after the collapse of Netscape), but it paid a heavy price both in terms of the speed of development of the language and popularity :-(.

Python suffers from severe case of NIH syndrome

The major problem I have with Python is that it falls into the "NIH" syndrome trap. For example, instead of providing a clean, innovative interface to the existing classic Unix utilities (perhaps using their compilation in a special mode), it essentially tries to reimplement all Unix utilities on its own.

That not only dramatically increases complexity and makes the language less attractive to Unix system administrators. It also raises serious questions about the deep provinciality of Guido van Rossum (who at the time of this writing is over 60 and as such has essentially ended his programmer career) and his major coworkers.

At the time of this writing, the Python interface to Unix utilities is immature and changes from one version of the language to another. In this sense Python is definitely inferior to Perl, and even to the shell.

Python ecosystem

A language itself has little or no value without the full language ecosystem, which includes a debugger, IDEs, books, Web sites, forums, test suites, etc. Python does have a debugger, which became quite decent after, say, version 2.7.

Currently Python has the most developed ecosystem among all scripting languages, with a dozen high-quality IDEs available (Perl has almost none, although Komodo Edit can be used as a proxy). PyCharm is probably the most popular (and it has a free version for individual developers).

Starting from version 2.7 the debugger supports almost the same set of operations as the famous Perl debugger. In other words, at last it managed to bridge the gap with the Perl debugger. Better late than never ;-). From version 3.4 the debugger can be invoked from within the script, a feature Perl has had for ages.

While from 1990 to 2000 an avalanche of Perl books was published (although surprisingly few of them were of decent quality, and those were mostly non-O'Reilly books ;-), now new Perl books are a very rare event. Python here definitely dominates. There are probably more books published about Python than about any other scripting language. In 2017 books about Python are still being baked like there is no tomorrow but, as was the case with Perl before, most of them are of poor or average quality, especially those written by OO zealots. This amount of "junk books" is an interesting feature of the language. Many Python books should be avoided at all costs, as they do not explain the language but obscure it. You are warned.

One sign of the popularity of a scripting language is the availability of editors which use it as a macro language. Here Python outshines Perl by several orders of magnitude.

see PythonEditors - Python Wiki

Komodo Editor is a decent quality middleweight free editor  that supports writing add-ons in Python.  See Macro API Reference

The first language effect

I would like to stress again that one of the main reasons Python is so popular is the so-called "first language effect" (aka the "Peter Naur effect", as Peter Naur was the first to mention the profound importance of being accessible to beginners, an effect which first demonstrated itself with Basic): a language that universities teach students as the first programming language (and often the only programming language) has a tremendous advantage over the alternatives.

Python was adopted as the first language for teaching programming at many universities, and that produces a steady stream of language users and some enthusiasts. And after the language found its niche in intro university courses, the proverb "nothing succeeds like success" is fully applicable.

Infatuation with Java in US universities declined substantially from the late 90s on, because Java does not represent a step forward in the development of programming languages (both Perl and Python do represent such a step). As skeptics often call it, it is Cobol for the XXI century with garbage collection on top of it.

As a programming language there is nothing interesting in Java in comparison with C and C++; it is one step forward and two steps back.

And Python is definitely preferable to Java as an introductory programming language: the Python interpreter has an interactive mode, suitable for small examples, and it is more or less forgiving for novices. It also has a decent debugger, although not of the quality of the Perl debugger. At the very least, Python causes much less frustration for novices than either Java or Perl (Perl novices have great difficulty mastering the usage of the sigils $, @ and %, and suffer from the mandatory semicolon at the end of statements and the abuse of brackets).

The absence of an obligatory semicolon at the end of statements helps greatly. The same is true for the absence of C-style curly-bracket-delimited blocks (the source of a lot of grief for beginners, as a missing curly bracket is difficult to locate).

Python also has a more or less regular lexical structure and simpler syntax than both Java and Perl, which both belong to the C-style languages, while Python does not: it is a class of its own. Due to its very complex lexical structure and syntax, Perl is horrible as a beginner language. But at the same time, for Unix sysadmins the semantics of Perl is closer to the "Unix spirit", and it is better understandable if you came from Bash and C (the typical two languages that a sysadmin knows, or used to know in the past). For example, in Perl $line is a dereferencing of the pointer \$line and represents the value at this location, while in Python line is both the pointer to the location and the value.

Overcomplexity: both Python and Perl are above the level of ordinary human comprehension
and need to be operated as subsets

Despite some claims that Python adheres to simplicity, this is simply not true. It just pushed a lot of complexity into the standard library, a trick previously successfully used by C, but with the same disastrous results (C's treatment of strings as byte arrays, not as a separate data type as in PL/1, is in retrospect a disaster).

All in all this is a large, complex, non-orthogonal language, not that different in this respect from Perl. Just the set of warts is different. And while Perl 5 has changed little over the last ten years due to the lack of development funds, Python adds and adds complexity. That's why in some areas, like bioinformatics-related statistics, R gains ground at the expense of Python. R is a more C-style (it preserves curly brackets for blocks) higher-level language than Python, with better organized and documented libraries of packages and a Perl-style repository, CRAN.

The level of integration of some important scripting language constructs (regular expressions, I/O) is worse in Python than in Perl, although Python 3.6 and higher improved in this area, at last introducing f-strings, something that resembles Perl double-quoted strings in convenience ;-)
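A sketch of the convenience in question, assuming Python 3.6+ for the f-string (the variable names are illustrative):

```python
user = "postgres"
count = 3
# Perl interpolates variables in double-quoted strings automatically;
# Python needed the f-string prefix (3.6+) to get comparable convenience.
msg = f"user {user} owns {count} processes"
print(msg)  # user postgres owns 3 processes
```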

And look at the discussions about how to execute pipelines and external commands in Python. At this point any illusion that this is an orthogonal language should disappear. In this area it is more baroque than Perl.

Pushing part of complexity into the set of core modules solves nothing, because the computational environment in which Python operates (Unix or Windows) still has the same level of complexity and that level of complexity should be reflected in the language one way or another.

It is important to understand that the "real Python" is a large and complex language, and a decent textbook such as Learning Python, 5th Edition by Mark Lutz (2013) is over a thousand pages. A modest intro like Introducing Python: Modern Computing in Simple Packages by Bill Lubanovic is 484 pages. The slightly more expanded Python Essential Reference (4th Edition) by David Beazley is over 700 pages. The O'Reilly cookbook is 706 pages. And so on and so forth. Humans can't learn such a large language and need to operate with a subset.

So despite claims to the opposite, Python belongs to the same family of "complex" languages as Perl: languages no single human can fully master in his lifetime. Of course, as with natural language comprehension, the level of knowledge of the language greatly varies, with some individuals being able to absorb a higher level of complexity, but that does not change the situation "in general".

The key reflection of this overcomplexity is the level at which Python is presented in the most popular books: there is no single book that, as in C, creates a solid understanding of how language constructs are mapped onto the actual computer implementation. In this sense Python will never be able to match C.

This means that, contrary to the hype, both Perl and Python belong to the class of languages with an attitude of "whatever gets the job done", although Python pretends (just pretends) to follow the KISS principle: on the surface Python designers seem to prefer simplicity and consistency in design to the flexibility and multiple ways of performing the same operation that Perl advocates (and sometimes goes too far with this concept ;-)

Version 2.7 vs. 3.7 problem

There is no single Python language. There are two dialects, which are often called 3.x and 2.x. Version 2.7 is now dominant, and this is what can be called "classic Python". Adoption of version 3.x is still low and there is not much enthusiasm for this transition due to the size of the Python 2 codebase, so the plan is to force migration in January 2020. The biggest difference between contemporary Python 3 and legacy Python 2 is the handling of strings:

String differences between Python 3 and Python 2

The biggest difference between contemporary Python 3 and legacy Python 2 is the handling of strings. In versions of Python up to and including Python 2 the str type was a so-called byte string, where each character was encoded as a single byte. In this sense, Python 2 str was similar to the Python 3 bytes, however, the interface presented by str and bytes is in fact different in significant ways.

In particular their constructors are completely different and indexing into a bytes object returns an integer rather than a single code point string. To confuse matters further, there is also a bytes type in Python 2.6 and Python 2.7, but this is just a synonym for str and as such has an identical interface.

If you're writing text handling code intended to be portable across Python 2 and Python 3 – which is perfectly possible – tread carefully!

There are some other differences. For example, print is a statement in Python 2 but a function, print(), in Python 3.
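The print change and the bytes/str change can both be seen in a few lines of Python 3 (a sketch):

```python
# print is a function in Python 3 (a statement in Python 2)
print("hello")

s = "abc"    # str: a sequence of Unicode code points
b = b"abc"   # bytes: a sequence of small integers
print(s[0])  # 'a' -- indexing str gives a one-character string
print(b[0])  # 97  -- indexing bytes gives an integer
```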

Currently Python 2.7 is the dominant version of Python for scientific and engineering computing (although the standard version that comes with RHEL 6.x is still Python 2.6). The 64-bit version is dominant.

But the 3.x version is promoted by new books and gradually (albeit very slowly) gains in popularity too. Python developers claim that they will stop support of version 2.7 in January 2020, which will be a big shock to the Python community.

Still, while the Python 3 switch to Unicode strings is a big change, there are some silver linings in this dark cloud.

For example, Python 3 has better support for coroutines. Here is a quote from Fluent Python (chapter 16):

The infrastructure for coroutines appeared in PEP 342 — Coroutines via Enhanced Generators, implemented in Python 2.5 (2006): since then, the yield keyword can be used in an expression, and the .send(value) method was added to the generator API. Using .send(…), the caller of the generator can post data that then becomes the value of the yield expression inside the generator function. This allows a generator to be used as a coroutine: a procedure that collaborates with the caller, yielding and receiving values from the caller.

In addition to .send(…), PEP 342 also added .throw(…) and .close() methods that respectively allow the caller to throw an exception to be handled inside the generator, and to terminate it. These features are covered in the next section and in “Coroutine Termination and Exception Handling”.

The latest evolutionary step for coroutines came with PEP 380 - Syntax for Delegating to a Subgenerator, implemented in Python 3.3 (2012). PEP 380 made two syntax changes to generator functions, to make them more useful as coroutines:

These latest changes will be addressed in “Returning a Value from a Coroutine” and “Using yield from”.

Let’s follow the established tradition of Fluent Python and start with some very basic facts and examples, then move into increasingly mind-bending features.
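A minimal PEP 342-style coroutine illustrating .send(...); this running-total example is a common textbook sketch, not taken from Fluent Python verbatim:

```python
def running_total():
    """Coroutine: each value sent in becomes the result of `yield`."""
    total = 0
    while True:
        value = yield total   # receive a value from the caller
        total += value

acc = running_total()
next(acc)             # "prime" the coroutine: run to the first yield
print(acc.send(10))   # 10
print(acc.send(5))    # 15
acc.close()           # terminate it, as PEP 342 allows
```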

Note on the difference between Python assignments and Perl assignments

The following explanation destroys the idea that Python is a simple language. In Perl all variable types defined in the language (scalars, lists, arrays and hashes) are copied "by value". That means that a new copy is created as the result of assignment.

Python assignment never copies: it merely binds a name to an object. For immutable types (numbers and strings) the behavior is indistinguishable from Perl's copy-by-value, but for mutable types such as lists and hashes (called dictionaries in Python) both names end up referring to the same object.

Novices are usually not told about this nuance and happily write programs which usually behave as expected, but in reality they just do not understand the level of complexity of a language with hidden references. See assignment in python - Stack Overflow for more information:

... ... ...

I like to picture variables in python as the name written on 'labels' that are attached to boxes but can change its placement by assignment, whereas in other languages, assignment changes the box's contents (and the assignment operator can be overloaded).

Beginners can write quite complex applications without being aware of that, but then they are burned (python - Copied variable changes the original):

André Freitas, Nov 14, 2011 at 13:56

I have a simple problem in Python that is very very strange.
def estExt(matriz,erro):
    # (1) Determine the vector X of solutions
    print ("Matrix before:");
    print(matriz);

    aux=matriz;
    x=solucoes(aux); # IF aux is a copy of matriz, why is matriz changed??

    print ("Matrix after: ");
    print(matriz)

...

As you can see below, the matrix matriz is changed in spite of the fact that aux is the one passed to the function solucoes().

Matrix before:
[[7, 8, 9, 24], [8, 9, 10, 27], [9, 10, 8, 27]]

Matrix after:
[[7, 8, 9, 24], [0.0, -0.14285714285714235, -0.2857142857142847, -0.42857142857142705], [0.0, 0.0, -3.0, -3.0000000000000018]]

André Freitas ,Nov 14, 2011 at 17:16

The line
aux=matriz;

does not make a copy of matriz; it merely creates a new reference to matriz named aux. You probably want

aux=matriz[:]

Which will make a copy, assuming matriz is a simple data structure. If it is more complex, you should probably use copy.deepcopy

aux = copy.deepcopy(matriz)

As an aside, you don't need semi-colons after each statement, python doesn't use them as EOL markers.

André Freitas ,Nov 15, 2011 at 8:49

Use copy module
aux = copy.deepcopy(matriz) # there is copy.copy too for shallow copying

Minor one: semicolons are not needed.

aux is not a copy of matrix , it's just a different name that refers to the same object.

Modules and OO

The key Python feature is the ability to use modules. In this sense it is a derivative of Modula, which was a really revolutionary language by Niklaus Wirth, and one of the few older mainstream languages that implemented coroutines. Modules are a great language feature in their own right. In Python, OO features are bolted on top of this and somewhat spoil the broth.

While Python provides OO features, like C++ it can be used without them. They were added to the language, not present from the very beginning. And as in many other languages with OO features, they became fashionable and promoted the language. OO features are also badly abused in Python scripts: I saw many cases when Linux/Unix maintenance scripts were written using OO features, which makes them less maintainable and the code twice or more as verbose and obscure.

While pointers to memory structures (aka objects) are how OO is implemented, unlike Perl, Python does not provide pointers as a separate data type. You can use pointers via the object-oriented framework, but generally this is a perversion. I think that pointers are such a valuable programming construct that they should be presented as a separate data type. All complex structures in Python are actually typed pointers, but this is too little, too late.

Python is shipped with all versions of Linux, but not with other Unix flavors

Currently Python is shipped as a standard component only with Linux and FreeBSD. Neither Solaris, nor AIX, nor HP-UX includes Python by default (but they do include Perl).

Of course, popularity of those flavors of Unix is in decline, but they hold their  niche in certain areas like large databases processing.

Quantity and quality of library of available modules

By the number and quality of available modules Python is now competitive with, and in some areas exceeds, Perl with its famous CPAN. Like Perl, Python also has a large library of standard modules shipped with the interpreter. And actually they are much better documented. Perl modules were recently partially spoiled by the switch to the object-oriented style of programming promoted by some Perl advocates, so the previous advantage in simplicity that Perl used to hold has now disappeared. It is now often painful to read modules in the Perl standard library.

Among the standard Python modules useful for sysadmins we can mention the following

The fact that regular expressions are not a part of the language, as they are in Perl, is a setback: they proved to be very useful in text processing, and this diminishes the expressive power of the language for many text-processing tasks.
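For example, even a trivial match requires importing the re module; a sketch (the sample line is invented for illustration):

```python
import re

line = "eth0: dropped 1500 packets"
m = re.search(r"(\d+) packets", line)   # no built-in =~ operator as in Perl
if m:
    print(m.group(1))   # 1500
```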

But Python also faces competition from more modern languages such as Ruby and, paradoxically, from R. Although still less popular, Ruby competes with Python on features, especially among programmers who value the coroutine paradigm of software development (and it is really a paradigm, not just a language feature). Python provides generators that are close enough, but still...

Paradoxically, Python managed to get a foothold in numeric computing

Python also got a foothold in numeric computing via SciPy/NumPy. It is now widely used in the molecular modeling area, which was previously dominated by compiled languages (with Fortran being the major one). In genomic applications it also managed to displace Perl, because the new generation of bioinformatics engineers is trained at universities using Biopython (although the quality of regular expression integration into the language is much lower in Python). Paradoxically, and unexplainably to me, R is now pushing Python out in this domain.

Python displaced Perl in bioinformatics

While Perl was dominant in the early days of bioinformatics, this dominance did not last. It is now almost completely displaced by Python in this area. BioPython displaced BioPerl, and new books cover almost exclusively BioPython. That's somewhat strange, as Perl has a more efficient implementation of strings (in Python strings are immutable objects, in Perl they are mutable, so if an operation shortens the string or produces a string of the same length, no new memory allocation is needed).
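The immutability is easy to demonstrate: every "modification" of a Python string allocates a new object:

```python
s = "gattaca"
t = s.replace("g", "c")   # allocates a new string object
print(t)        # cattaca
print(s)        # gattaca -- the original is untouched
print(s is t)   # False: two distinct objects
```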

Perl also has regular expressions better integrated into the language.

Indentation as a proxy for nesting -- questionable design decision

Python imitates FORTRAN IV: it uses indentation to denote nesting, similarly to how FORTRAN IV used it to distinguish between labels and statements ;-).

That creates problems if tabs are used: the source ends up with a mixture of lines with leading spaces and tabs, and the Python interpreter might have different settings for tabs, which can screw up the nesting of statements. This also creates problems with diffs and applying patches with patch. Which simply means that tabs should never be used while writing Python code.

Multiline statements in Python are either detected by unbalanced brackets or can be explicitly marked with a backslash. This is a much better solution than the mandatory semicolon at the end of statements in Perl (you can omit it before '}', but in all other cases the interpreter will complain).
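Both continuation mechanisms in a sketch:

```python
# Continuation is implicit while brackets remain unbalanced...
total = (1 + 2 +
         3 + 4)

# ...or explicit with a trailing backslash.
also = 1 + 2 + \
       3 + 4

print(total, also)  # 10 10
```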

So while Perl carried over this design blunder from PL/1 and C (which in fact is a PL/1 derivative), Python managed to find a better, more optimal solution for programmers. It especially benefits novices, but is still useful for seasoned Perl programmers. I do not know how many hours of my time were wasted on finding and correcting this type of persistent error.

The ability to close multiple blocks by just changing indentation is another big plus (actually PL/1 was capable of doing that, but neither C nor Perl can), but only for short programs which are visible on the screen.

At the same time, the possibility of mixing tabs and blanks for indentation is a horror show. You need to specifically check whether your Python program accidentally contains tabs and convert them to blanks.

By relegating block brackets to the lexical level of blank space and comments, Python failed to make a necessary adjustment and include a pretty printer in the interpreter (with the possibility of creating a pretty-printed program from pseudo-comments like #fi, #while, etc.). Such a pretty printer is actually needed to understand larger Python programs. The format of the comments can be Algol-68 style (or Bash style, if you wish, for people who never heard of Algol 68; that's the only feature that the Bourne shell inherited from Algol 68, and Bourne was a member of the Algol 68 development team before writing this program ;-). For example:

if a > b :
   delta=b-a
#fi

and the current number of spaces in the tab like  pragma tab = 4. The interesting possibility is that in pretty printed program those comments can be removed and after a reading pretty printed program into the editor reinstated automatically. Such feature can be implemented in any scriptable editor.

My impression is that few people understand that the C solution for blocks ({ } blocks) was pretty weak in comparison with its prototype language (PL/1): it does not permit a nice labeled closure of blocks like

A1:do 
   ... 
end A1; 

in PL/1. IMHO introduction of a pretty printer as a standard feature of both a compiler and a GUI environment is long overdue and here Python can make its mark.

By adding indentation as the proxy for nesting, Python actually encourages a programmer to use a decent editor (which means not the nano/notepad family ;-), but we knew that already, right?

This design decision also narrows down the possible range of coding styles and automatically leads to more compact (as for the number of lexical tokens) programs (deletion of curly brackets usually helps to lessen the number of lines in a C or Perl program by 20% or more).

Difficulties of adapting to the language for Perl programmers

Although Python as a scripting language used Perl as a prototype and its features are generally similar to Perl's, Perl programmers experience difficulties adapting to the language. The two are not overly similar in implementation details, nor even remotely similar in syntax. Their extension mechanisms also took different directions.

Processing every line in a file

Perl:

while (<>) {
    print $_;
}

The Perl version allows getting input from STDIN, or from an arbitrary number of file arguments (which are processed one by one). This actually allows implicit concatenation of an arbitrary number of files supplied as parameters.

Python (not exact equivalent but covers the most common case when filename was supplied explicitly via an argument)

for line in open('filename.txt'):
    print line
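A closer equivalent of Perl's while (<>) loop is the standard fileinput module, which iterates over all files named on the command line and falls back to STDIN (Python 3 syntax; the temporary file below exists only to keep the example self-contained):

```python
import fileinput
import os
import tempfile

# Create a throwaway file so the example runs without command-line args.
tmp = tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False)
tmp.write('first line\nsecond line\n')
tmp.close()

# fileinput.input() with no arguments reads every file named in
# sys.argv[1:], falling back to stdin -- the closest analog of Perl's
# while (<>) loop. Here we pass the file explicitly instead.
for line in fileinput.input(files=[tmp.name]):
    print(line, end='')

os.unlink(tmp.name)
```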

Some functions are either redefined or missing in Python. For example, substr is absent, because Python treats a string as an array of characters (like C), with array notation extended to slices. Perl's index function exists in Python in multiple incarnations: one is the in keyword used in loops and membership tests, the other is the index method of the string class, which is more confusing than in Perl ;-). On the other hand, Python allows limiting the search to a specific substring (via start and end positions), while Perl allows only specifying the starting position of the search.

mystring = 'abcdefghijklmnoprstu'
print mystring.index('b',1,5)   # prints 1
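To make the substr comparison concrete: slicing covers what Perl's substr does, the in keyword handles membership tests, and str.find is a non-raising alternative to index (Python 3 syntax):

```python
mystring = 'abcdefghijklmnopqrstu'

# Slicing replaces Perl's substr(EXPR, OFFSET, LENGTH):
print(mystring[2:5])        # prints: cde

# Membership test with the 'in' keyword:
print('cde' in mystring)    # prints: True

# find() returns -1 instead of raising ValueError when absent:
print(mystring.find('zz'))  # prints: -1
```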
Python has a less transparent way of getting the output of a standard Unix utility or filter into a script:
import subprocess
df = subprocess.Popen(["df", "filename"], stdout=subprocess.PIPE)
output = df.communicate()[0]
device, size, used, available, percent, mountpoint = output.split("\n")[1].split()
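In modern Python, the subprocess.check_output() convenience function shortens this somewhat; it is roughly what Perl programmers get from qx// backticks. (Illustrated here with echo rather than df, since df output varies by platform; assumes a POSIX system.)

```python
import subprocess

# check_output() runs a command and returns its stdout in one call,
# raising CalledProcessError on a non-zero exit status.
output = subprocess.check_output(["echo", "hello world"],
                                 universal_newlines=True)
print(output.strip())
```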

While Perl has q and qq string constants and HERE-documents, Python has some nice features too, although they do not match Perl's power:

Adjacent string literals are concatenated by the Python compiler into a single string:

>>> "first" "second"
'firstsecond'

That can be very useful for splitting a long string across lines to get nicely formatted code. Perl evaluates the . operator between string constants at compile time, so it has a similar feature without introducing additional notation (which makes Python the less orthogonal language here, as the + operator does the same thing and can also be optimized at compile time).
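In practice, adjacent literals are usually combined with parentheses to split a long string across source lines:

```python
# Adjacent string literals inside parentheses are fused at compile
# time into a single constant -- no '+' needed at runtime.
message = ("This long diagnostic message is split "
           "across two source lines for readability.")
print(message)
```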

Python also has multiline strings, which are delimited by three quote characters rather than one. Here's an example using three double-quotes:

>>> """This is
... a multiline
... string"""
'This is\na multiline\nstring'


Final notes

We all understand that in real life the better language seldom wins (look at Java). Luck plays a tremendous role in determining a language's popularity. The best commercially supported language that fits the current fashion has better chances. Python managed to ride the wave of enthusiasm for OO programming, which (by and large) unfairly relegated Perl to the second tier of languages. And Python is not a bad scripting language, so in a way its success is our success too.

It is also a very sad event when the language you learned and used for a decade or more fades, and you need to learn yet another one just to keep up with programming fashion.

But there is a silver lining in any dark cloud. Using a better supported language lets you choose a better IDE and other tools, and as such you will be partially compensated for the time lost relearning and switching to the new language. Also, to a certain extent, learning Python makes you appreciate Perl better ;-)

For sysadmins that's an especially difficult question to answer, as much depends on your individual situation. If you write programs mostly for yourself and do not need to support Python programmers, you are probably better off staying with Perl. If you need to support Python programmers, with all the pip troubles that entails, you had better start learning Python and make it your primary language.

Python now also has several different implementations of the interpreter, which is a clear sign of both popularity and maturity of the language. Along with the CPython interpreter (the standard one) there is the quite popular Jython, which runs on the JVM and thus integrates well with Java, and IronPython, which is Microsoft's implementation for .NET (Python -- programming language).

The mainstream Python implementation, also known as CPython, is written in C compliant to the C89 standard, and is distributed with a large standard library written in a mixture of C and Python. CPython ships for a large number of supported platforms, including Microsoft Windows and most modern Unix-like systems. CPython was intended from almost its very conception to be cross-platform; its use and development on esoteric platforms such as Amoeba alongside more conventional ones like Unix or Macintosh has greatly helped in this regard.

Stackless Python is a significant fork of CPython that implements microthreads. It can be expected to run on approximately the same platforms that CPython runs on.

There are two other major implementations: Jython for the Java platform, and IronPython for the .NET platform. PyPy is an experimental self-hosting implementation of Python, in Python, that can output a variety of types of bytecode, object code and intermediate languages.

Several programs exist to package Python programs into standalone executables, including py2exe, PyInstaller, cx_Freeze and py2app.

Many Python programs can run on different Python implementations, on such disparate operating systems and execution environments, without change. In the case of the implementations running on top of the Java virtual machine or the Common Language Runtime, the platform-independence of these systems is harnessed by their respective Python implementation.

Many third-party libraries for Python (and even some first-party ones) are only available on Windows, Linux, BSD, and Mac OS X.

There is also a dialect called Stackless Python which adds support for coroutines, communication channels and task serialization.  As of 2019 this distribution is still maintained.

Python also has a better interface to C programs than Perl, and makes it easier to write extension modules in C.
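A minimal illustration of calling into C without writing an extension module at all, using the standard ctypes module (POSIX-only sketch: CDLL(None) exposes the C-library symbols already linked into the interpreter):

```python
import ctypes

# On POSIX systems, loading with None (dlopen(NULL)) gives access to
# the symbols of the C library already linked into the process.
libc = ctypes.CDLL(None)

# Call the C library's abs(3) directly; ctypes converts the Python
# int argument to a C int and the result back.
print(libc.abs(-42))   # prints: 42
```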

Nikolai Bezroukov



NEWS CONTENTS

Old News ;-)

[Oct 22, 2019] Is Python call-by-value or call-by-reference Neither

This author does not understand the concept of references.
And while the article does not explain the concept right, some of the comments on it are valuable. As Florian Bösch noted: "The proper terminology is that everything in python is a reference, period."
Oct 22, 2019 | jeffknupp.com

One aspect of Python programming that trips up those coming from languages like C or Java is how arguments are passed to functions in Python. At a more fundamental level, the confusion arises from a misunderstanding about Python object-centric data model and its treatment of assignment. When asked whether Python function calling model is "call-by-value" or "call-by-reference", the correct answer is: neither . Indeed, to try to shoe-horn those terms into a conversation about Python's model is misguided. "call-by-object," or "call-by-object-reference" is a more accurate way of describing it. But what does "call-by-object" even mean?

In Python, (almost) everything is an object. What we commonly refer to as "variables" in Python are more properly called names . Likewise, "assignment" is really the binding of a name to an object . Each binding has a scope that defines its visibility, usually the block in which the name originates.
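The binding behavior described above can be demonstrated in a few lines: rebinding a parameter inside a function never affects the caller's name, while mutating the object it is bound to does (Python 3 syntax):

```python
def rebind(seq):
    seq = [99]        # binds the *local* name 'seq' to a new object

def mutate(seq):
    seq.append(99)    # mutates the object both names are bound to

data = [1, 2]
rebind(data)
print(data)           # prints: [1, 2]
mutate(data)
print(data)           # prints: [1, 2, 99]
```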

That's a lot of terminology all at once, but those basic terms form the cornerstone of Python's execution model . Compared to, say, C++, the differences are subtle yet important, and a concrete example will highlight them.

Tom Rees 7 years ago ,

Please don't muddy the waters with confusing articles like this. The Java community (and others) go round in circles trying to straighten this out, but the confusion will never die. Python and Java are entirely call-by-value; call-by-value has a specific (relatively ancient) technical definition; please don't try to overload the term or update its definition. Discussion of reference semantics is interesting, but trying to use or redefine these terms is not: They are only useful in discussions involving languages such as C or Pascal.

Jamie Marshall Brando Miranda a year ago ,

This is completely wrong. It's just a misunderstanding of when assignments happen. In the end computer memory, despite what everyone says only works in one of two ways, by reference, or by value. Anything else you think might be happening is just you misunderstanding the time at which an assignment is declared by reference or by value.

ich • 7 years ago ,

What? This is just call-by-value, where all values are object pointers. It's the same as C#.

Edward Rich 5 years ago ,

Well, "call-by-value where the value is the pointer (address) of an object" is one way to define call-by-reference.
http://dictionary.reference...

Andrey Upadyshev Edward 2 years ago • edited ,

Not really. Reference (in C++ world at least) is something that can't be rebound from one object to another and that can't be bound to a no object (nullptr or None), while with a pointer the both operations are possible, so Python is definitely a call-by-pointer approach (or call-by-value where value is a pointer).

mdeboard 7 years ago ,

This article is (inadvertently, I think) an excellent explanation of scope & state in Python. I'm not sure the headline holds up but good article.

Jeffrey mdeboard 5 years ago ,

I did learn more about scope reading this, unexpectadely.

Daniel DeCovnick 7 years ago • edited ,

The last two code snippets muddy the waters badly, because you're not changing the object that's passed as the parameter in the first example. The *contents* of the object are changing, but not the object itself. Consider for the first example instead (or as a third example):

def foo(bar):
    bar = []
    bar.append(42)
    print(bar)
    # >> [42]

answer_list = []
foo(answer_list)
print(answer_list)
# >> []

So yes, you really can say that Python is call-by-value, where all values are object references. Just like in C#, Java, and numerous other languages. C++ can do the same thing as long as you avoid stack objects and stick to pointers*, and Objective-C does precisely that, but enforcing it so that you can't create a stack object (C primitives and structs and Objective-C++ notwithstanding). In all those cases, the function call model is identical to Python.

*C++ additionally has true pass-by-reference semantics available using reference parameters, but that's not relevant at the moment.

Florian Bösch 7 years ago ,

The proper terminology is that everything in python is a reference, period. It's btw. the same as for Ruby, Javascript, Perl, Lua, Dart etc. The actual notable exception from this semantic is PHP which choose to implement variables in some extremely awkward, brittle and error prone way that are neither values nor references and can carry side effects beyond their scope.

ngm_disqus 7 years ago ,

Nice post. I think "call by object reference" is a good name. It is basically "call by value," except that you're ONLY allowed to pass object references as your values. It is illustrative to compare this to C++, which has call by value [void f(int x)] and call by (non-object) reference [void f(int& x)]. C# also has call by value [void f(int x)] and call by (non-object) reference [void f(ref int x), void f(out x)]. Both languages let you pass non-references as your params. I guess Java has call by value (primitives) and call by object reference (objects). I believe C only has call by value. If you want reference semantics, then your value must be a pointer.

I guess Python docs are a bit fast-and-loose with the "immutable" label. I guess the object needs to be immutable "all the way down" before we can reason about it as a value.

Markus Zhang ngm_disqus 7 months ago ,

I think it's also because of the mutable/immutable thing that confuses people.

Tim • 6 years ago ,

I think this article would be a lot better if you stated what you thought the meanings of "pass by reference" and "pass by value" were. It might be that you have an inconsistent definition of those from other people (because all the examples you provided fit pass by value under my definition, where all the values are pointers to other data).

Name • 5 years ago ,

OMG!!!! This article is absurd! There are two (and only two) kinds of things that we can store in memory: DATA and the ADDRESS where that data is stored. When we pass DATA through the stack we call this "pass by VALUE". When we pass the ADDRESS of the DATA through the stack we call this "pass by REFERENCE". There is no such thing of NEITHER. It's a pitty that all people can post articles in the net: those who write RIGHT posts and those who write WRONG ones.

Jeffrey Name 5 years ago ,

Traceback (most recent call last):
File "<stdin>", line 5, in <comment>
NameError: name 'pitty' is not defined

Cosimo Zecchi Name 2 years ago ,

You are right if you refer to computer science in general. But programming languages have different levels of abstraction. There are low level languages and high level languages, so I think that if you refer to a language it's correct to evaluate what the language is doing in his level.
In C you manage references and address directly. In python (but also in JS as I can see) the language do it for you and the article explains you that in several situations it acts like pass by value (although in the very end it's not true but that's the final behaviour).
Maybe could be interesting seeing how python manage the memory and the garbage collector.

Anonymous Name 2 years ago ,

in python the value is the reference to an object; so yea you don't pass neither by value and neither by reference; google it before you try to be a smart fella.

Rovan Anonymous 2 years ago ,

It doesn't matter what the value is. The truth is function calling model in Python is call-by-value. That's it.

Jeffrey Name 5 years ago ,

Despite my trolling, you're absolutely right. Just, you know, calm down. The net will be the net afterall...

tarun bansal 5 years ago ,

After one year of programming in python learnt about this concept today only. Explained beautifully, but shocking to see terms like "muddy the waters" in comments.

Jeffrey tarun bansal 5 years ago ,

Learning is walking through a marsh to find out that you're in a marsh. I don't think the commenter realizes how muddy the waters really are. This post is an attempt to provide a semantical explanation in terms that Python users and beginners can understand. I think he did a great job, because after reading more about references vs values I understood it more because of THIS post's breakdown.

spockout • 7 years ago ,

Very informative and well written. Thank you for making the effort!

Guest • 7 years ago ,

I believe this is call-by-value, where variables are references. A function call copies the references into local variables. These references can point to either mutable or immutable things. Additionally, these mutable or immutable things can contain references to other mutable or immutable things. I think this covers all cases. Can anyone think of a counter example?

qznc • 7 years ago ,

Come again. Why is it not call-by-reference?

Swami • 5 years ago ,

This article is very well-written. You threw out the naming conventions and truly conveyed how Python is different.

Oxygen Yogi 2 years ago ,

So, the reference gets copied by value.

rickoshay 2 years ago ,

If nothing else you've confirmed that folks can get hopelessly confused about a rather simple construct, so in that sense this long rambling and uninformed drivel serves a purpose.

Java is always call-by-value, it's just that the value is a reference so it feels like call by reference. C++ lets you pass values like 42 or a whole object, where the whole object is copied; facilitating true call by value.

A true call by reference, on the other hand, means values are never copied ever: not for true, 42 or some object. The names are shared identities. Ignore that and just refer to Java and C++ semantics for clarity.


Jeffrey ngm_disqus 5 years ago ,

This explanation was a bit too advanced for me to understand fully at the moment, but you're still providing very good insight into and answering my question. Thank you.

revskill 7 years ago ,

Thank you very much for the article. I think this is the first think to learn when learning a new programming language.

orclev 7 years ago ,

so... it's call by reference. The only gotcha is that all 'variables' are references, which as pointed out by Florian is pretty standard in most languages (E.G. C#, Java, etc.). In other words assignment to a variable merely creates a new reference, it doesn't mutate the existing value.

SirClueless orclev 7 years ago ,

It's definitely *not* call-by-reference. Call-by-reference means that if I call foo(bar), then the value of bar can be changed by foo, which is not true.

It is call-by-value, where every value is an object reference.

John SirClueless 7 years ago ,

If you try this code:

>>> a = []
>>> def foo(b):
...     b.append('a')
...

>>> foo(a)
>>> a
['a']

>>> foo(a)
>>> a
['a', 'a']

You will see that if you call foo(bar) you are able to change the value of bar.

theq629 John 7 years ago ,

No, you can mutate the object it is bound to and you can change the value of the parameter b inside the function. You can't change what bar is bound to.

Markus Zhang 7 months ago • edited ,

I really enjoy this article as it is the only one that uses "binding" and "object". I think the key is to understand two things: 1) Assignment in Python is actually Binding (I'd love to know what exactly binding is, but I think it's similar to adding a reference to a piece of memory in C++, except that in Python you can bind a name to any type of variables); 2) Some object are mutable, i.e. you can change it anywhere, even in different scopes and every name bidden to it is also aware of the change.

However, there is one more thing that I feel murky. What happens when I pass parameters to and from functions? Maybe instead of passing parameters, we are actually passing objects (only sort of), as a new name pointed to the same object is automatically created whenever the object is passed, and this is even true when it is an empty function. Also, NEVER a new object is created in the procedure of parameter passing, and ONLY when assignment occurs a new object is created.

I think it's equally confusing about string functions. Strings are immutable, however you can call functions to modify it. But actually each time you modify it, you just create a new string. This is very confusing to beginners.

shailesh pratap a year ago • edited ,

Really liked your article. Not going by what other feels, may be they have a better understanding of how things works in other programming languages but definitely learnt from how variable assignment works with respect to python. Keep doing your good work.

oeyh • a year ago ,

Great article, thanks! But I feel that the third example (the one containing 'another_list_of_names') does not illustrate "An immutable object does not exhibit time-varying behavior" very well. An example like this will do:
a = 'string'
b = a
a += 's'
print(a)
print(b)

a becomes 'strings', but value of b is still 'string'.

Tristan Yan a year ago • edited ,

This is totally wrong, this is not related to pass by value or pass by reference, this is related to semantic of constructor.
Like in C++, when you do
A a = b;

a couple of things could happens, depends on do you have copy constructor, assignment constructor. In other language like Python or Java, it doesn't have this concept, so it uses the same reference.

Tristan Yan Tristan Yan a year ago ,

BTW, when people are talking about copy by value or copy by reference, people refer to how the argument will be handled by calling a function, C++ had special type reference, so it has the difference with call by reference or by value, typically by value is by default, because it's safer. But it could waste of memory and affect performance especially the object is huge.

stevenlehar 2 years ago ,

The simplest explanation is the simplest demo:

a = 5
b = a
a,b => (5,5)

a=6
a,b => (6,5)

And now with lists:
a = [5]
b = a
a,b => ([5], [5])

a = [6]
a,b => ([6], [6])

a and b point to the same object.

Enda O'Brien stevenlehar 9 months ago ,

So why does my interpreter generate different results for the lists?

>>> a = [5]
>>> b = a
>>> a,b
([5], [5])
>>> a is b
True
>>> a = [6]
>>> a,b
([6], [5])
>>> a is b
False

stevenlehar Enda O'Brien 9 months ago ,

Oops! My bad. Not a=[6], because that defines a whole new list. Instead, do a[0]=6, to change the value in a without starting a new list.

Enda O'Brien stevenlehar 9 months ago ,

Okay, fair enough, thank you!

kapil 2 years ago ,

Thanks for the explanation. Could you please help me with a doubt I encountered while verifying the concepts.

def fuc(bar):
    bar=bar*2

al=[1,2,3,4]
fuc(al)
print(al)

Here the output shows the unchanged old list.
Did the operation "bar*2" create another object ? I was thinking I modified the same object.

Markus Zhang kapil 7 months ago ,

Yes if you id(bar) before passing it to fuc() and inside of fuc() you are going to see two different virtual addresses. However, if you write bar=bar , then they are of the same address.

niranjan patidar kapil 9 months ago ,

########### Example 1 ##############
def fuc1(bar1):
    b1 = bar1 # just for checking reference
    bar1.append(bar*2)
    print(b1 is bar1) # check if both are pointing same object -- True

# call func
bar=[1,2,3,4]
fuc1(bar)
print(bar)

########### Example 2 ##############
def fuc(bar):
    b2 = bar # just for checking reference
    bar=bar*2 # Now here reference changes, hence changes are not reflected
    print(b2 is bar) # check if both are pointing same object -- False
    bar.append(bar*2)

al=[1,2,3,4]
fuc(al)
print(al)

Onereason09 • 2 years ago ,

I really do appreciate this article. I was exposed to different programming languages (among which C,C++ and Java) and for me your explanations really do make sense. Thank you!

Colathur Vijayan 2 years ago ,

Truly Awesome and well written. Jeff''s articles on Python are insigntful, expository and make complex things simple (without losing conceptual rigor).

Levent Atan 2 years ago ,

I was thinking Python has the notion of pass-by-value exactly like Java. However, Java creates a separate memory location for each of the arguments in the stack where python only binds the name to that location.

Guest • 7 years ago ,

Nice post. I think the name for this is "call by reference." It is basically "call by value," except that you're ONLY allowed to pass object references as your values. It is illustrative to compare this to C++, which has call by value [void f(int x)] and call by reference [void f(int& x)]. C# also has call by value [void f(int x)] and call by reference [void f(ref int x), void f(out x)]. Both languages let you pass non-references as your params. I guess Java has call by value (primitives) and call by reference (objects). I believe C only has call by value. If you want reference semantics, then your value must be a pointer.

I guess Python docs are a bit fast-and-loose with the "immutable" label. I guess the object needs to be immutable "all the way down" before we can reason about it as a value.

Jeffrey Guest 5 years ago ,

padding?

Mark Lewis 9 months ago ,

The semantics discussed in this article are identical to Java, C#, and a plethora of other languages. All variables are references to objects in the language and those references are passed by value. As Tom Rees pointed out six year ago, this is just muddying the waters.

Where this post is really wrong, not just obfuscating things, is with statements like "Each binding has a scope that defines its visibility, usually the block in which the name originates." This implies that Python has block scope. It doesn't. It has global scope and function scope. Consider the following code:

>>> def foo(a):
...     if a==0:
...         b=5
...     print(b)
...
>>> foo(0)
5

In a language with block scope, b only exists inside the if, making this an error in all invocations (as I would argue it should be). However, Python doesn't have block scope, it has function scope, so a variable used anywhere in a function, regardless of the scope, exists at all points in the function.

The more interesting question is, what is the scope of arguments to functions. I had always assumed they had function scope as well. It might not be what I like, but at least it would be logical. Then I saw the following example:

>>> def bar(a, b=[]):
...     b.append(a)
...     return b
...
>>> bar(5)
[5]
>>> bar(6)
[5, 6]

This behavior is just messed up because clearly b lives globally to be remembered between calls, so it doesn't have function scope. But wait, it gets worse. Consider this minor change:

>>> def bar(a, b={}):
...     if b=={}:
...         b=[]
...     b.append(a)
...     return b
...
>>> bar(5)
[5]
>>> bar(6)
[6]

Now the value of b isn't remembered between calls and is being re-initialized on each call. WAT?!?


[Oct 22, 2019] Difference in regex behavior between Perl and Python?

Oct 22, 2019 | stackoverflow.com



Gumbo ,Apr 16, 2009 at 18:42

I have a couple email addresses, 'support@company.com' and '1234567@tickets.company.com' .

In perl, I could take the To: line of a raw email and find either of the above addresses with

/\w+@(tickets\.)?company\.com/i

In python, I simply wrote the above regex as '\w+@(tickets\.)?company\.com' expecting the same result. However, support@company.com isn't found at all and a findall on the second returns a list containing only 'tickets.' . So clearly the '(tickets\.)?' is the problem area, but what exactly is the difference in regular expression rules between Perl and Python that I'm missing?

Axeman ,Apr 16, 2009 at 21:10

The documentation for re.findall :
findall(pattern, string, flags=0)
    Return a list of all non-overlapping matches in the string.

    If one or more groups are present in the pattern, return a
    list of groups; this will be a list of tuples if the pattern
    has more than one group.

    Empty matches are included in the result.

Since (tickets\.) is a group, findall returns that instead of the whole match. If you want the whole match, put a group around the whole pattern and/or use non-grouping matches, i.e.

r'(\w+@(tickets\.)?company\.com)'
r'\w+@(?:tickets\.)?company\.com'

Note that you'll have to pick out the first element of each tuple returned by findall in the first case.

chaos ,Apr 16, 2009 at 18:45

I think the problem is in your expectations of extracted values. Try using this in your current Python code:
'(\w+@(?:tickets\.)?company\.com)'

Jason Coon ,Apr 16, 2009 at 18:46

Two problems jump out at me:
  1. You need to use a raw string to avoid having to escape " \ "
  2. You need to escape " . "

So try:

r'\w+@(tickets\.)?company\.com'

EDIT

Sample output:

>>> import re
>>> exp = re.compile(r'\w+@(tickets\.)?company\.com')
>>> bool(exp.match("s@company.com"))
True
>>> bool(exp.match("1234567@tickets.company.com"))
True

,

There isn't a difference in the regexes, but there is a difference in what you are looking for. Your regex is capturing only "tickets." if it exists in both regexes. You probably want something like this
#!/usr/bin/python

import re

regex = re.compile("(\w+@(?:tickets\.)?company\.com)");

a = [
    "foo@company.com", 
    "foo@tickets.company.com", 
    "foo@ticketsacompany.com",
    "foo@compant.org"
];

for string in a:
    print regex.findall(string)

[Oct 21, 2019] Python variable reference assignment

Oct 21, 2019 | stackoverflow.com



viji ,Jun 27, 2012 at 8:43

In the code
y = 7
x = y    
x = 8

Now, y will be 7 and x will be 8. But actually I wanna change y. Can I assign the reference of y and do that ?

For example, in C++ the same thing can be achieved as,

int y = 8;
int &x = y;
x = 9;

Now both y & x will be 9

Daniel Roseman ,Jun 27, 2012 at 9:20

No, you cannot. As other answer point out, you can (ab?)use aliasing of mutable objects to achieve a similar effect. However, that's not the same thing as C++ references, and I want to explain what actually happens to avoid any misconceptions.

You see, in C++ (and other languages), a variable (and object fields, and entries in collections, etc.) is a storage location and you write a value (for instance, an integer, an object, or a pointer) to that location. In this model, references are an alias for a storage location (of any kind) - when you assign to a non-reference variable, you copy a value (even if it's just a pointer, it's still a value) to the storage location; when you assign to a reference, you copy to a storage location somewhere else. Note that you cannot change a reference itself - once it is bound (and it has to as soon as you create one) all assignments to it alter not the reference but whatever is referred to.

In Python (and other languages), a variable (and object fields, and entries in collections, etc.) is a just a name. Values are somewhere else (e.g. sprinkled all over the heap), and a variable refers (not in the sense of C++ references, more like a pointer minus the pointer arithmetic) to a value. Multiple names can refer to the same value (which is generally a good thing). Python (and other languages) calls whatever is needed to refer to a value a reference, despite being pretty unrelated to things like C++ references and pass-by-reference. Assigning to a variable (or object field, or ...) simply makes it refer to another value. The whole model of storage locations does not apply to Python, the programmer never handles storage locations for values. All he stores and shuffles around are Python references, and those are not values in Python, so they cannot be target of other Python references.

All of this is independent of mutability of the value - it's the same for ints and lists, for instance. You cannot take a variable that refers to either, and overwrite the object it points to. You can only tell the object to modify parts of itself - say, change some reference it contains.

Is this a more restrictive model? Perhaps, but it's powerful enough most of the time. And when it isn't you can work around it, either with a custom class like the one given below, or (equivalent, but less obvious) a single-element collection.

class Reference:
    def __init__(self, val):
        self._value = val # just refers to val, no copy

    def get(self):
        return self._value

    def set(self, val):
        self._value = val

That still won't allow you to alias a "regular" variable or object field, but you can have multiple variables referring to the same Reference object (ditto for the mutable-singleton-collection alternative). You just have to be careful to always use .get() / .set() (or [0] ).

rbaleksandar ,Sep 24, 2013 at 19:34

No, Python doesn't have this feature.

If you had a list (or any other mutable object) you could do what you want by mutating the object that both x and y are bound to:

>>> x = [7]
>>> y = x
>>> y[0] = 8
>>> print x
[8]

See it working online: ideone

tauseef_CuriousGuy ,Sep 7, 2017 at 8:10

You should use a mutable object for this.

In python x & y are just references to objects so y = 7 means y points to the object 7 . x=y means x too points to 7 , but as 7 is immutable so changing the value of x simply changes the object 7 and y still remains pointing to 7 .

>>> y = [7]
>>> x = y
>>> x[0] = 8 # here you're changing [7] not x or y, x & y are just references to [7]
>>> y
[8]


Alternatively, you could use a self crafted container.
class Value(object):
    def __init__(self, value): self.value = value

y = Value(7)
x = y
x.value = 8
print(y.value)

[Oct 16, 2019] Python Variable Declaration - Stack Overflow

Oct 16, 2019 | stackoverflow.com



jpaugh ,Feb 28, 2017 at 23:52

I am learning Python, and I have some basic doubts.

1.I have seen variable declaration (path here) as

class writer:
    path = ""

sometimes, no explicit declaration but initialize through __init__ .

def __init__(self, name):
    self.name = name

I understand the purpose of __init__ , but is it advisable to declare variables in other functions as well?

2. How can I create a variable to hold a custom type?

class writer:
    path = "" # string value
    customObj = ??

Martijn Pieters ♦ ,Mar 9 at 3:52

Okay, first things first.

There is no such thing as "variable declaration" or "variable initialization" in Python.

There is simply what we call "assignment", but should probably just call "naming".

Assignment means "this name on the left-hand side now refers to the result of evaluating the right-hand side, regardless of what it referred to before (if anything)".

foo = 'bar' # the name 'foo' is now a name for the string 'bar'
foo = 2 * 3 # the name 'foo' stops being a name for the string 'bar',
# and starts being a name for the integer 6, resulting from the multiplication

As such, Python's names (a better term than "variables", arguably) don't have associated types; the values do. You can re-apply the same name to anything regardless of its type, but the thing still has behaviour that's dependent upon its type. The name is simply a way to refer to the value (object). This answers your second question: You don't create variables to hold a custom type. You don't create variables to hold any particular type. You don't "create" variables at all. You give names to objects.

Second point: Python follows a very simple rule when it comes to classes, that is actually much more consistent than what languages like Java, C++ and C# do: everything declared inside the class block is part of the class . So, functions ( def ) written here are methods, i.e. part of the class object (not stored on a per-instance basis), just like in Java, C++ and C#; but other names here are also part of the class. Again, the names are just names, and they don't have associated types, and functions are objects too in Python. Thus:

class Example:
    data = 42
    def method(self): pass

Classes are objects too , in Python.

So now we have created an object named Example , which represents the class of all things that are Example s. This object has two user-supplied attributes (In C++, "members"; in C#, "fields or properties or methods"; in Java, "fields or methods"). One of them is named data , and it stores the integer value 42 . The other is named method , and it stores a function object. (There are several more attributes that Python adds automatically.)

These attributes still aren't really part of the object, though. Fundamentally, an object is just a bundle of more names (the attribute names), until you get down to things that can't be divided up any more. Thus, values can be shared between different instances of a class, or even between objects of different classes, if you deliberately set that up.

Let's create an instance:

x = Example()

Now we have a separate object named x , which is an instance of Example . The data and method are not actually part of the object, but we can still look them up via x because of some magic that Python does behind the scenes. When we look up method , in particular, we will instead get a "bound method" (when we call it, x gets passed automatically as the self parameter, which cannot happen if we look up Example.method directly).

What happens when we try to use x.data ?

When we examine it, it's looked up in the object first. If it's not found in the object, Python looks in the class.

However, when we assign to x.data , Python will create an attribute on the object. It will not replace the class' attribute.
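That lookup order (instance first, then class) can be sketched as:

```python
class Example:
    data = 42          # a class attribute

x = Example()
print(x.data)          # 42 -- not found on the instance, so found on the class
x.data = 99            # creates an *instance* attribute; the class is untouched
print(x.data)          # 99 -- the instance attribute shadows the class one
print(Example.data)    # 42 -- the class attribute is unchanged
del x.data             # remove the instance attribute...
print(x.data)          # 42 -- ...and the class attribute is visible again
```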

This allows us to do object initialization. Python will automatically call the class' __init__ method on new instances when they are created, if present. In this method, we can simply assign to attributes to set initial values for that attribute on each object:

class Example:
    name = "Ignored"
    def __init__(self, name):
        self.name = name
    # rest as before

Now we must specify a name when we create an Example , and each instance has its own name . Python will ignore the class attribute Example.name whenever we look up the .name of an instance, because the instance's attribute will be found first.

One last caveat: modification (mutation) and assignment are different things!

In Python, strings are immutable. They cannot be modified. When you do:

a = 'hi '
b = a
a += 'mom'

You do not change the original 'hi ' string. That is impossible in Python. Instead, you create a new string 'hi mom' , and cause a to stop being a name for 'hi ' , and start being a name for 'hi mom' instead. We made b a name for 'hi ' as well, and after re-applying the a name, b is still a name for 'hi ' , because 'hi ' still exists and has not been changed.
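A sketch of this rebinding, using the built-in id() to observe object identity:

```python
a = 'hi '
b = a
print(a is b)    # True -- both names refer to the same string object
a += 'mom'       # builds a new string 'hi mom' and rebinds the name a
print(a is b)    # False -- a now names a different object
print(b)         # 'hi ' -- untouched; it still exists and was never changed
```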

But lists can be changed:

a = [1, 2, 3]
b = a
a += [4]

Now b is [1, 2, 3, 4] as well, because we made b a name for the same thing that a named, and then we changed that thing. We did not create a new list for a to name, because Python simply treats += differently for lists.

This matters for objects because if you had a list as a class attribute, and used an instance to modify the list, then the change would be "seen" in all other instances. This is because (a) the data is actually part of the class object, and not any instance object; (b) because you were modifying the list and not doing a simple assignment, you did not create a new instance attribute hiding the class attribute.
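A sketch of the difference, again using id(): += mutates the existing list in place, while plain + builds a new one:

```python
a = [1, 2, 3]
b = a
before = id(a)
a += [4]                 # in-place: extends the existing list object
print(id(a) == before)   # True -- still the same object
print(b)                 # [1, 2, 3, 4] -- b sees the change

a = a + [5]              # plain + builds a *new* list and rebinds a
print(id(a) == before)   # False -- a new object
print(b)                 # [1, 2, 3, 4] -- b still names the old list
```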

detly ,Jun 13, 2012 at 3:38

There's no need to declare new variables in Python. If we're talking about variables in functions or modules, no declaration is needed. Just assign a value to a name where you need it: mymagic = "Magic" . Variables in Python can hold values of any type, and you can't restrict that.

Your question specifically asks about classes, objects and instance variables though. The idiomatic way to create instance variables is in the __init__ method and nowhere else -- while you could create new instance variables in other methods, or even in unrelated code, it's just a bad idea. It'll make your code hard to reason about or to maintain.

So for example:

class Thing(object):

    def __init__(self, magic):
        self.magic = magic

Easy. Now instances of this class have a magic attribute:

thingo = Thing("More magic")
# thingo.magic is now "More magic"

Creating variables in the namespace of the class itself leads to different behaviour altogether. It is functionally different, and you should only do it if you have a specific reason to. For example:

class Thing(object):

    magic = "Magic"

    def __init__(self):
        pass

Now try:

thingo = Thing()
Thing.magic = 1
# thingo.magic is now 1

Or:

class Thing(object):

    magic = ["More", "magic"]

    def __init__(self):
        pass

thing1 = Thing()
thing2 = Thing()
thing1.magic.append("here")
# thing1.magic AND thing2.magic is now ["More", "magic", "here"]

This is because the namespace of the class itself is different to the namespace of the objects created from it. I'll leave it to you to research that a bit more.
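As a starting point for that research: the instance and the class each carry their own attribute dictionary, __dict__:

```python
class Thing(object):
    magic = "Magic"

thingo = Thing()
print(thingo.__dict__)            # {} -- the instance holds nothing of its own
print('magic' in Thing.__dict__)  # True -- the class owns the attribute

thingo.magic = 1                  # assignment creates an instance attribute
print(thingo.__dict__)            # {'magic': 1}
print(Thing.magic)                # 'Magic' -- the class attribute is untouched
```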

The take-home message is that idiomatic Python is to (a) initialise object attributes in your __init__ method, and (b) document the behaviour of your class as needed. You don't need to go to the trouble of full-blown Sphinx-level documentation for everything you ever write, but at least leave some comments about whatever details you or someone else will need in order to pick it up later.

O. Aroesti ,Jul 4, 2018 at 14:35

This might be 6 years late, but in Python 3.6 and above, you can annotate a variable's type like this:
variable_name: type_name

or this:

variable_name # type: shinyType

So in your case(if you have a CustomObject class defined), you can do:

customObj: CustomObject

See this or that for more info.
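Applied to the original question's writer class, this might look as follows (CustomObject is a hypothetical class); note that an annotation alone records the intended type but does not create the attribute:

```python
class CustomObject:            # hypothetical custom type
    pass

class Writer:
    path: str = ""             # annotated and assigned
    customObj: CustomObject    # annotation only -- no attribute is created

print(Writer.__annotations__)        # {'path': <class 'str'>, 'customObj': <class 'CustomObject'>}
print(hasattr(Writer, 'customObj'))  # False -- nothing was assigned to it
```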

redhot ,Jun 22, 2017 at 15:29

For scoping purpose, I use:
custom_object = None

,

Variables have scope, so yes, it is appropriate to have variables that are specific to your function. You don't always have to be explicit about their definition; usually you can just use them. Only if you want to do something specific to the type of the variable, like append for a list, do you need to define them before you start using them. A typical example of this:

my_list = []   # avoid the name 'list': it shadows the built-in type
for i in stuff:
  my_list.append(i)

By the way, this is not really a good way to set up the list. It would be better to say:

my_list = [i for i in stuff] # list comprehension

...but I digress.

Your other question. The custom object should be a class itself.

class CustomObject(): # always capitalize the class name...this is not syntax, just style.
  pass
customObj = CustomObject()

[Oct 14, 2019] Python Strings, Functions and Examples by Meenakshi Agarwal

Oct 14, 2019 | www.techbeamers.com
How to Create Strings in Python?

Creating strings is easy, as you only need to enclose the characters in either single or double quotes. In the following example, we provide different ways to initialize strings. An important note: you can also use triple quotes to create strings; however, programmers usually reserve them for multi-line strings and docstrings.
# Python string examples - all assignments are identical.
String_var = 'Python'
String_var = "Python"
String_var = """Python"""

# with Triple quotes Strings can extend to multiple lines
String_var = """ This document will help you to
explore all the concepts
of Python Strings!!! """

# Replace "document" with "tutorial" and store in another variable
substr_var = String_var.replace("document", "tutorial")
print (substr_var)

Index and Slice Strings in Python

Access Individual Characters of a String

You need to know the index of a character to retrieve it from the String. Like most programming languages, Python indexes strings from the zeroth position. But it also supports negative indexes: an index of '-1' represents the last character of the String; similarly, using '-2' we can access the penultimate character of the string, and so on.

sample_str = 'Python String'

print (sample_str[0])       # return 1st character
# output: P

print (sample_str[-1])      # return last character
# output: g

print (sample_str[-2])      # return last second character
# output: n
Slice a String in Python

To retrieve a range of characters in a String, we use the 'slicing operator', the colon ':' sign. With the slicing operator, we define the range as [a:b]: it gives us all the characters of the String starting from index 'a' up to the character at index 'b-1', so the character at index 'b' is not part of the output.
sample_str = 'Python String'
print (sample_str[3:5])     #return a range of character
# ho
print (sample_str[7:])      # return all characters from index 7
# String
print (sample_str[:6])      # return all characters before index 6
# Python
print (sample_str[7:-4])
# St
Next, we have a number of Python tutorials/quizzes/interview questions on this blog. If you would like to try them, refer to any of the posts listed below. Suggested Reading: ☛ 100+ Python Interview Questions TOC

Python Strings – Common Error Codes

1- If we try to retrieve a character at an out-of-range index, an 'IndexError' exception will be raised.
sample_str = "Python Supports Machine Learning."
print (sample_str[1024])      #index must be in range

# IndexError: string index out of range
2- A String index must be of the integer data type. You should not use a float or any other data type for this purpose; otherwise, Python will raise a TypeError as it detects a data type violation for the string index.
sample_str = "Welcome post"
print (sample_str[1.25])      #index must be an integer

# TypeError: string indices must be integers
Modify/Delete a String in Python

Python Strings are immutable by design: once a String binds to a variable, it can't be modified. If you want to update the String, re-assign a new String value to the same variable.
sample_str = 'Python String'
sample_str[2] = 'a'

# TypeError: 'str' object does not support item assignment

sample_str = 'Programming String'
print (sample_str)

# Output=> Programming String
Similarly, we cannot modify a String by deleting some characters from it. Instead, we can remove the String altogether by using the 'del' statement.
sample_str = "Python is the best scripting language."
del sample_str[1]
# TypeError: 'str' object doesn't support item deletion

del sample_str
print (sample_str)
# NameError: name 'sample_str' is not defined
Suggested Reading: ☛ Python Programming Interview Questions TOC

String Operators in Python

Concatenation (+) It combines two strings into one.
# example
var1 = 'Python'
var2 = 'String'
print (var1+var2)
# PythonString
Repetition (*) This operator creates a new string by repeating it a given number of times.
# example
var1 = 'Python'
print (var1*3)
# PythonPythonPython
Slicing [ ] The slice operator returns the character at a given index.
# example
var1 = 'Python'
print (var1[2])
# t
Range Slicing [x:y] It returns the characters present in the given range.
# example
var1 = 'Python'
print (var1[2:5])
# tho
Membership (in) This operator returns 'True' if the character is present in the given String.
# example
var1 = 'Python'
print ('n' in var1)
# True
Membership (not in) It returns 'True' if the character is not present in the given String.
# example
var1 = 'Python'
print ('N' not in var1)
# True
Iterating (for) With this operator, we can iterate through all the characters of a string.
# example
for var in var1: print (var, end ="")
# Python
Raw String (r/R) We can use it to suppress the special meaning of escape characters inside a string. For this, we add 'r' or 'R' in front of the String.
# example
print (r'\n')
# \n
print (R'\n')
# \n
TOC

String Formatting Operators in Python

Python Escape Characters

An escape sequence starts with a backslash (\), which signals the interpreter to treat the following character differently. Python automatically interprets an escape sequence regardless of whether it appears in a single-quoted or double-quoted String.

We need a way to tell Python that the double-quotes inside the string are not the string markup quotes; instead, they are part of the String and should appear in the output. To resolve this issue, we can escape the double-quotes and single-quotes as:

print ("Python is a "widely" used language")

# SyntaxError: invalid syntax

# After escaping with double-quotes

print ("Python is a \"widely\" used language")

# Output: Python is a "widely" used language
List of Escape Characters
Here is the complete list of escape characters that are represented using backslash notation.

\\ Backslash (\)

\" Double-quote (")

\a ASCII bell (BEL)

\b ASCII backspace (BS)

\cx or \Cx Control-x

\f ASCII Form feed (FF)

\n ASCII linefeed (LF)

\N{name} Character named name in the Unicode database (Unicode only)

\r Carriage Return (CR)

\t Horizontal Tab (TAB)

\uxxxx A character with 16-bit hex value xxxx (Unicode only)

\Uxxxxxxxx A character with 32-bit hex value xxxxxxxx (Unicode only)

\v ASCII vertical tab (VT)

\ooo Characters with octal value ooo

\xnn A character with hex value nn where n can be anything from the range 0-9, a-f or A-F.

... ... ...

Here's a simple example.

print ("Employee Name: %s,\nEmployee Age:%d" % ('Ashish',25))
# Employee Name: Ashish, 
# Employee Age: 25
List of Format Symbols

Following is the table containing the complete list of symbols that you can use with the '%' operator.

  • %c character
  • %s string conversion via str() before formatting
  • %i signed decimal integer
  • %d signed decimal integer
  • %u unsigned decimal integer
  • %o octal integer
  • %x hexadecimal integer (lowercase letters)
  • %X hexadecimal integer (UPPER-case letters)
  • %e exponential notation (with lowercase 'e')
  • %E exponential notation (with UPPER-case 'E')
  • %f floating-point real number
  • %g the shorter of %f and %e
  • %G the shorter of %f and %E
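A few of these specifiers in action (a small illustrative sketch):

```python
print("%c" % 80)               # P -- character from its code point
print("%d" % 255)              # 255 -- signed decimal integer
print("%o" % 255)              # 377 -- octal
print("%x / %X" % (255, 255))  # ff / FF -- hexadecimal, lower/upper case
print("%e" % 1234.5)           # 1.234500e+03 -- exponential notation
print("%g" % 1234.5)           # 1234.5 -- shorter of %e and %f
```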

Unicode String support in Python

In Python 2, regular Strings are stored as 8-bit ASCII values, whereas Unicode Strings follow the 16-bit Unicode standard. This extension allows strings to include characters from the different languages of the world. In Python 2, the letter 'u' works as a prefix to distinguish Unicode strings from the usual ones; in Python 3, all strings are Unicode and the prefix is redundant.

print (u'Hello Python!!')
# Hello Python!!
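In Python 3 the u prefix is accepted but redundant, since every str is already a Unicode string; binary data lives in the separate bytes type. A quick sketch:

```python
s = 'Hello Python!!'
u = u'Hello Python!!'          # the u prefix is legal but redundant in Python 3
print(s == u)                  # True -- same type, same value
print(type(s) is str)          # True

b = s.encode('utf-8')          # explicit conversion to bytes
print(type(b) is bytes)        # True
print(b.decode('utf-8') == s)  # True -- decoding round-trips
```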

Suggested Reading:

Python Online Practice Test

TOC

Built-in String Functions in Python Conversion Functions

1. capitalize() – Returns the string with the first character capitalized and rest of the characters in lower case.

var = 'PYTHON'
print (var.capitalize())
# Python

2. lower() – Converts all the characters of the String to lowercase

var = 'TechBeamers'
print (var.lower())
# techbeamers

3. upper() – Converts all the characters of the String to uppercase

var = 'TechBeamers'
print (var.upper())
# TECHBEAMERS

4. swapcase() – Swaps the case of every character in the String means that lowercase characters got converted to uppercase and vice-versa.

var = 'TechBeamers'
print (var.swapcase())
# tECHbEAMERS

5. title() – Returns the 'titlecased' version of String, which means that all words start with uppercase and the rest of the characters in words are in lowercase.

var = 'welcome to Python programming'
print (var.title())
# Welcome To Python Programming

6. count(str[, beg [, end]]) – Returns the number of times substring 'str' occurs in the range [beg, end], if the beg and end indexes are given; otherwise the search covers the full String. The search is case-sensitive.

var='TechBeamers'
str='e'
print (var.count(str))
# 3
var1='Eagle Eyes'
print (var1.count('e'))
# 2
var2='Eagle Eyes'
print (var2.count('E',0,5))
# 1

TOC

Comparison Functions – Part1

1. islower() – Returns 'True' if all the characters in the String are in lowercase. If any of the characters is in uppercase, it will return False.

var='Python'
print (var.islower())
# False

var='python'
print (var.islower())
# True

2. isupper() – Returns 'True' if all the characters in the String are in uppercase. If any of the characters is in lowercase, it will return False.

var='Python'
print (var.isupper())
# False

var='PYTHON'
print (var.isupper())
# True

3. isdecimal() – Returns 'True' if all the characters in the String are decimal characters. If any character is not, it will return False.

Decimal characters are those from the Unicode category Nd.

num=u'2016'
print (num.isdecimal())
# True

4. isdigit() – Returns 'True' for any character for which isdecimal() would return 'True', plus some characters in the 'No' category. If there are any characters other than these, it will return 'False'.

Precisely, digits are the characters for which Unicode property includes: Numeric_Type=Digit or Numeric_Type=Decimal.

For example, superscripts are digits, but fractions are not.

print ('2'.isdigit())
# True

print ('²'.isdigit())
# True
Comparison Functions – Part2

1. isnumeric() – Returns 'True' if all the characters of the Unicode String lie in any one of the categories Nd, No, and Nl.

If there are any characters other than these, it will return False.

Precisely, Numeric characters are those for which Unicode property includes: Numeric_Type=Digit, Numeric_Type=Decimal or Numeric_Type=Numeric.

num=u'2016'
print (num.isnumeric())
# True

num=u'year2016'
print (num.isnumeric())
# False
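The three predicates isdecimal(), isdigit() and isnumeric() can be compared side by side; superscript two and the one-half fraction show where they differ:

```python
# '2' (plain digit), '\u00b2' (superscript two), '\u00bd' (vulgar fraction 1/2)
for ch in ('2', '\u00b2', '\u00bd'):
    print(ch, ch.isdecimal(), ch.isdigit(), ch.isnumeric())
# 2 True True True
# ² False True True
# ½ False False True
```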

2. isalpha() – Returns 'True' if String contains at least one character (non-empty String) and all the characters are alphabetic, 'False' otherwise.

print ('python'.isalpha())
# True

print ('python3'.isalpha())
# False

3. isalnum() – Returns 'True' if String contains at least one character (non-empty String) and all the characters are either alphabetic or decimal digits, 'False' otherwise.

print ('python'.isalnum())
# True
print ('python3'.isalnum())
# True

TOC

Padding Functions

1. rjust(width[,fillchar]) – Returns the string right-justified in a field of the given width, padded with the fill character.

By default, the padding uses a space. Otherwise 'fillchar' specifies the filler character.

var='Python'
print (var.rjust(10))
#     Python

print (var.rjust(10,'-'))
# ----Python

2. ljust(width[,fillchar]) – Returns a padded version of String with the original String left-justified to a total of width columns

By default, the padding uses a space. Otherwise 'fillchar' specifies the filler character.

var='Python'
print (var.ljust(10))
# Python

print (var.ljust(10,'-'))
# Python----

3. center(width[,fillchar]) – Returns the string centered in a field of the given width, padded with the fill character.

By default, the padding uses a space. Otherwise 'fillchar' specifies the filler character.

var='Python'
print (var.center(20))
# Python

print (var.center(20,'*'))
# *******Python*******

4. zfill(width) – Returns string filled with the original content padded on the left with zeros so that total length of String becomes equal to the input size.

If there is a leading sign (+/-) present in the String, then with this function padding starts after the symbol, not before it.

var='Python'
print (var.zfill(10))
# 0000Python

var='+Python'
print (var.zfill(10))
# +000Python

TOC

Search Functions

1. find(str [,i [,j]]) – Searches for 'str' in the complete String (if i and j are not defined) or in a sub-string of the String (if i and j are defined). This function returns the index if 'str' is found, else returns '-1'.

Here, i=search starts from this index, j=search ends at this index.

var="Tech Beamers"
str="Beam"
print (var.find(str))
# 5

var="Tech Beamers"
str="Beam"
print (var.find(str,4))
# 5

var="Tech Beamers"
str="Beam"
print (var.find(str,7))
# -1

2. index(str[,i [,j]]) – This is the same as the 'find' method. The only difference is that it raises a 'ValueError' exception if 'str' doesn't exist.

var='Tech Beamers'
str='Beam'
print (var.index(str))
# 5

var='Tech Beamers'
str='Beam'
print (var.index(str,4))
# 5

var='Tech Beamers'
str='Beam'
print (var.index(str,7))
# ValueError: substring not found

3. rfind(str[,i [,j]]) – This is the same as find(), except that it returns the last index where 'str' is found. If 'str' is not found, it returns '-1'.

var='This is a good example'
str='is'
print (var.rfind(str,0,10))
# 5

print (var.rfind(str,10))
# -1

4. count(str[,i [,j]]) – Returns the number of occurrences of substring 'str' in the String. Searches for 'str' in the complete String (if i and j not defined) or in a sub-string of String (if i and j are defined).

Where: i=search starts from this index, j=search ends at this index.

var='This is a good example'
str='is'
print (var.count(str))
# 2

print (var.count(str,4,10))
# 1

TOC

String Substitution Functions

1. replace(old,new[,count]) – Replaces all the occurrences of substring 'old' with 'new' in the String.

If count is given, then only the first 'count' occurrences of 'old' will be replaced with 'new'.

Where: old = the substring to replace, new = the replacement substring

var='This is a good example'
str='was'
print (var.replace('is',str))
# Thwas was a good example
print (var.replace('is',str,1))
# Thwas is a good example

2. split([sep[,maxsplit]]) – Returns a list of substrings obtained after splitting the String with 'sep' as a delimiter.

Where: sep = the delimiter (the default is space), maxsplit = the maximum number of splits to be done

var = "This is a good example"
print (var.split())
# ['This', 'is', 'a', 'good', 'example']
print (var.split(' ', 3))
# ['This', 'is', 'a', 'good example']

3. splitlines([keepends]) – Splits the String at line breaks and returns the list of lines.

Where: keepends = if this is a true value, the line breaks are kept in the items of the returned list.

var='Print new line\nNextline\n\nMove again to new line'
print (var.splitlines())
# ['Print new line', 'Nextline', '', 'Move again to new line']
print (var.splitlines(1))
# ['Print new line\n', 'Nextline\n', '\n', 'Move again to new line']

4. join(seq) – Returns a String obtained by concatenating the elements of the sequence 'seq', with the string itself as the delimiter.

Where: seq = the sequence of elements to join

seq=('ab','bc','cd')
str='='
print (str.join(seq))
# ab=bc=cd

TOC

Misc String Functions

1. lstrip([chars]) – Returns a string after removing the characters from the beginning of the String.

Where: chars = the character(s) to be trimmed from the String. The default is whitespace.

var=' This is a good example '
print (var.lstrip())
# This is a good example
var='*****This is a good example*****'
print (var.lstrip('*'))
# This is a good example*****

2. rstrip([chars]) – Returns a string after removing the characters from the end of the String.

Where: chars = the character(s) to be trimmed from the String. The default is whitespace.

var=' This is a good example '
print (var.rstrip())
# This is a good example
var='*****This is a good example*****'
print (var.rstrip('*'))
# *****This is a good example

3. rindex(str[,i [,j]]) – Searches for 'str' in the complete String (if i and j not defined) or in a sub-string of String (if i and j are defined). This function returns the last index where 'str' is available.

If 'str' is not there, then it raises a ValueError exception.

Where: i=search starts from this index, j=search ends at this index.

var='This is a good example'
str='is'
print (var.rindex(str,0,10))
# 5
print (var.rindex(str,10))
# ValueError: substring not found

4. len(string) – Returns the length of given String

var='This is a good example'
print (len(var))
# 22

TOC

In this post, we tried to cover most of the string functionality available in Python. We hope that you now have a better understanding of Python strings.

If you have any questions regarding Python strings, please let us know. We'll try to answer them as soon as possible.

In Python 3.6, a new style of string formatting known as f-strings was introduced; do go through it as well.
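For completeness, here is a small sketch of the f-string equivalent of the %-formatting example above:

```python
name, age = 'Ashish', 25
print(f"Employee Name: {name},\nEmployee Age:{age}")
# Employee Name: Ashish,
# Employee Age:25

# Expressions and format specifications work inside the braces too:
print(f"{age * 2}")      # 50
print(f"{3.14159:.2f}")  # 3.14
```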

All the Best,

TechBeamers

[Oct 14, 2019] Coroutines and Tasks -- Python 3.7.5rc1 documentation

Oct 14, 2019 | docs.python.org

Coroutines declared with the async/await syntax are the preferred way of writing asyncio applications. For example, the following snippet of code (requires Python 3.7+) prints "hello", waits 1 second, and then prints "world":

>>>

>>> import asyncio

>>> async def main():
...     print('hello')
...     await asyncio.sleep(1)
...     print('world')

>>> asyncio.run(main())
hello
world

Note that simply calling a coroutine will not schedule it to be executed:

>>>
>>> main()
<coroutine object main at 0x1053bb7c8>

To actually run a coroutine, asyncio provides three main mechanisms:

  • The asyncio.run() function to run the top-level entry point "main()" function (see the above example.)
  • Awaiting on a coroutine. The following snippet of code will print "hello" after waiting for 1 second, and then print "world" after waiting for another 2 seconds:
    import asyncio
    import time
    
    async def say_after(delay, what):
        await asyncio.sleep(delay)
        print(what)
    
    async def main():
        print(f"started at {time.strftime('%X')}")
    
        await say_after(1, 'hello')
        await say_after(2, 'world')
    
        print(f"finished at {time.strftime('%X')}")
    
    asyncio.run(main())
    

    Expected output:

    started at 17:13:52
    hello
    world
    finished at 17:13:55
    
  • The asyncio.create_task() function to run coroutines concurrently as asyncio Tasks .

    Let's modify the above example and run two say_after coroutines concurrently :

    async def main():
        task1 = asyncio.create_task(
            say_after(1, 'hello'))
    
        task2 = asyncio.create_task(
            say_after(2, 'world'))
    
        print(f"started at {time.strftime('%X')}")
    
        # Wait until both tasks are completed (should take
        # around 2 seconds.)
        await task1
        await task2
    
        print(f"finished at {time.strftime('%X')}")
    

    Note that expected output now shows that the snippet runs 1 second faster than before:

    started at 17:14:32
    hello
    world
    finished at 17:14:34
    
Awaitables

We say that an object is an awaitable object if it can be used in an await expression. Many asyncio APIs are designed to accept awaitables.

There are three main types of awaitable objects: coroutines , Tasks , and Futures .

Coroutines

Python coroutines are awaitables and therefore can be awaited from other coroutines:

import asyncio

async def nested():
    return 42

async def main():
    # Nothing happens if we just call "nested()".
    # A coroutine object is created but not awaited,
    # so it *won't run at all*.
    nested()

    # Let's do it differently now and await it:
    print(await nested())  # will print "42".

asyncio.run(main())

Important

In this documentation the term "coroutine" can be used for two closely related concepts:

  • a coroutine function : an async def function;
  • a coroutine object : an object returned by calling a coroutine function .

asyncio also supports legacy generator-based coroutines.
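
The excerpt above stops before the docs' own Tasks and Futures sections; as a minimal sketch of the third awaitable type mentioned earlier, a Future can be awaited just like a coroutine (the names fill and main here are illustrative):

```python
import asyncio

async def fill(fut):
    # Simulate some work, then complete the Future
    await asyncio.sleep(0.01)
    fut.set_result(42)

async def main():
    loop = asyncio.get_running_loop()
    fut = loop.create_future()      # a Future is awaitable too
    asyncio.create_task(fill(fut))
    return await fut                # suspends until set_result() runs

print(asyncio.run(main()))          # prints 42
```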

[Oct 14, 2019] Coroutine in Python - GeeksforGeeks

Oct 14, 2019 | www.geeksforgeeks.org

In Python 2.5, a slight modification to the yield statement was introduced: yield can now also be used as an expression, for example on the right side of an assignment

line = (yield)

Whatever value we send to the coroutine is captured and returned by the (yield) expression. A value can be sent to the coroutine with the send() method. For example, consider this coroutine, which prints out names having the prefix "Dear" in them. We will send names to the coroutine using the send() method.

# Python3 program for demonstrating
# coroutine execution

def print_name(prefix):
    print("Searching prefix:{}".format(prefix))
    while True:
        name = (yield)
        if prefix in name:
            print(name)

# calling the coroutine, nothing will happen
corou = print_name("Dear")

# This will start execution of the coroutine,
# print the first line "Searching prefix..."
# and advance execution to the first yield expression
corou.__next__()

# sending inputs
corou.send("Atul")
corou.send("Dear Atul")

Output:

Searching prefix:Dear
Dear Atul

Execution of Coroutine

Execution of a coroutine is similar to that of a generator. When we call a coroutine, nothing happens; it runs only in response to the next() and send() methods. This can be seen clearly in the above example: only after calling the __next__() method does our coroutine start executing. After this call, execution advances to the first yield expression; now execution pauses and waits for a value to be sent to the corou object. When the first value is sent to it, it checks the prefix and prints the name if the prefix is present. After printing the name it goes through the loop until it encounters the name = (yield) expression again.

Closing a Coroutine

A coroutine might run indefinitely; to close a coroutine the close() method is used. When a coroutine is closed, a GeneratorExit exception is raised inside it, which can be caught in the usual way. After closing a coroutine, if we try to send values to it, a StopIteration exception is raised. Following is a simple example:

# Python3 program for demonstrating
# closing a coroutine

def print_name(prefix):
    print("Searching prefix:{}".format(prefix))
    try:
        while True:
            name = (yield)
            if prefix in name:
                print(name)
    except GeneratorExit:
        print("Closing coroutine!!")

corou = print_name("Dear")
corou.__next__()
corou.send("Atul")
corou.send("Dear Atul")
corou.close()

Output:

Searching prefix:Dear
Dear Atul
Closing coroutine!!

Chaining coroutines for creating pipeline

Coroutines can be used to set up pipelines. We can chain coroutines together and push data through the pipe using the send() method. A pipe needs:

  • An initial source (producer), which drives the whole pipeline. The producer is usually not a coroutine; it's just a simple method.
  • A sink , which is the end point of the pipe. A sink might collect all data and display it.

Following is a simple example of chaining –

# Python3 program for demonstrating
# coroutine chaining

def producer(sentence, next_coroutine):
    '''
    Producer which just splits the sentence and
    feeds the tokens to the pattern_filter coroutine
    '''
    tokens = sentence.split(" ")
    for token in tokens:
        next_coroutine.send(token)
    next_coroutine.close()

def pattern_filter(pattern="ing", next_coroutine=None):
    '''
    Search for pattern in the received token
    and if the pattern matches, send the token to
    the print_token() coroutine for printing
    '''
    print("Searching for {}".format(pattern))
    try:
        while True:
            token = (yield)
            if pattern in token:
                next_coroutine.send(token)
    except GeneratorExit:
        print("Done with filtering!!")

def print_token():
    '''
    Act as a sink: simply print the
    received tokens
    '''
    print("I'm sink, i'll print tokens")
    try:
        while True:
            token = (yield)
            print(token)
    except GeneratorExit:
        print("Done with printing!")

pt = print_token()
pt.__next__()
pf = pattern_filter(next_coroutine=pt)
pf.__next__()

sentence = "Bob is running behind a fast moving car"
producer(sentence, pf)

Output:

I'm sink, i'll print tokens
Searching for ing
running
moving
Done with filtering!!
Done with printing!

References

This article is contributed by Atul Kumar . If you like GeeksforGeeks and would like to contribute, you can also write an article using contribute.geeksforgeeks.org or mail your article to contribute@geeksforgeeks.org. See your article appearing on the GeeksforGeeks main page and help other Geeks.


[Oct 13, 2019] Python generators and coroutines

The new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3
Nov 16, 2017 | stackoverflow.com


Giuseppe Maggiore ,May 10, 2011 at 10:25

I am studying coroutines and generators in various programming languages.

I was wondering if there is a cleaner way to combine together two coroutines implemented via generators than yielding back at the caller whatever the callee yields?

Let's say that we are using the following convention: all yields apart from the last one return null, while the last one returns the result of the coroutine. So, for example, we could have a coroutine that invokes another:

def A():
  # yield until a certain condition is met
  yield result

def B():
  # do something that may or may not yield
  x = bind(A())
  # ...
  return result

in this case I wish that through bind (which may or may not be implementable, that's the question) the coroutine B yields whenever A yields until A returns its final result, which is then assigned to x allowing B to continue.

I suspect that the actual code should explicitly iterate A so:

def B():
  # do something that may or may not yield
  for x in A(): ()
  # ...
  return result

which is a tad ugly and error prone...

PS: it's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better!

S.Lott ,May 10, 2011 at 10:38

This is all covered on the Python web site. python.org/dev/peps/pep-0342 , python.org/dev/peps/pep-0334 and numerous blogs cover this. eecho.info/Echo/python/coroutines-python . Please Google, read, and then ask specific questions based on what you've read. – S.Lott May 10 '11 at 10:38

S.Lott ,May 10, 2011 at 13:04

I thought the examples clearly demonstrated idiomatic. Since I'm unable to understand what's wrong with the examples, could you state which examples you found to be unclear? Which examples were confusing? Can you be more specific on how all those examples where not able to show idiomatic Python? – S.Lott May 10 '11 at 13:04

Giuseppe Maggiore ,May 10, 2011 at 13:09

I've read precisely those articles, and the PEP-342 leaves me somewhat confused: is it some actual extension that is currently working in Python? Is the Trampoline class shown there part of the standard libraries of the language? BTW, my question was very precise, and it was about the IDIOMATIC way to pass control around coroutines. The fact that I can read about a ton of ways to do so really does not help. Neither does your snarkiness... – Giuseppe Maggiore May 10 '11 at 13:09

Giuseppe Maggiore ,May 10, 2011 at 13:11

Idiomatic is about the "standard" way to perform some function; there is absolutely nothing wrong with iterating the results of a nested coroutine, but there are examples in the literature of programming languages where yielding automatically climbs down the call stack and so you do not need to re-yield at each caller, hence my curiosity if this pattern is covered by sintactic sugar in Python or not! – Giuseppe Maggiore May 10 '11 at 13:11

S.Lott ,May 10, 2011 at 13:19

@Giuseppe Maggiore: "programming languages where yielding automatically climbs down the call stack" That doesn't sound like the same question. Are you asking for idiomatic Python -- as shown by numerous examples -- or are you asking for some other feature that's not shown in the Python examples but is shown in other languages? I'm afraid that I can't understand your question at all. Can you please clarify what you're really looking for? – S.Lott May 10 '11 at 13:19

blubb ,May 10, 2011 at 10:37

Are you looking for something like this?
def B():
   for x in A():
     if x is None:
       yield
     else:
       break

   # continue, x contains value A yielded

Giuseppe Maggiore ,May 10, 2011 at 12:59

Yes, that is what I am doing. My question is if this is the idiomatic way or if there is some syntactic construct that is capable of hiding this pattern which recurs very often in my application. – Giuseppe Maggiore May 10 '11 at 12:59

blubb ,May 10, 2011 at 13:31

@Guiseppe Maggiore: I'm not aware of any such constructs. However, it seems strange that you need this pattern often... I can't think of many valid used cases off the top of my head. If you give more context information, maybe we can propose an alternative solution which is more elegant over all? – blubb May 10 '11 at 13:31

Giuseppe Maggiore ,May 10, 2011 at 15:17

It's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better! – Giuseppe Maggiore May 10 '11 at 15:17

blubb ,May 10, 2011 at 15:57

@Guiseppe Maggiore: I'd propose you add that last comment to the question so that other get a chance of answering it, too... – blubb May 10 '11 at 15:57

Simon Radford ,Nov 11, 2011 at 0:24

Edit: I recommend using Greenlet . But if you're interested in a pure Python approach, read on.

This is addressed in PEP 342 , but it's somewhat tough to understand at first. I'll try to explain simply how it works.

First, let me sum up what I think is the problem you're really trying to solve.

Problem

You have a callstack of generator functions calling other generator functions. What you really want is to be able to yield from the generator at the top, and have the yield propagate all the way down the stack.

The problem is that Python does not ( at a language level ) support real coroutines, only generators. (But, they can be implemented.) Real coroutines allow you to halt an entire stack of function calls and switch to a different stack. Generators only allow you to halt a single function. If a generator f() wants to yield, the yield statement has to be in f(), not in another function that f() calls.

The solution that I think you're using now, is to do something like in Simon Stelling's answer (i.e. have f() call g() by yielding all of g()'s results). This is very verbose and ugly, and you're looking for syntax sugar to wrap up that pattern. Note that this essentially unwinds the stack every time you yield, and then winds it back up again afterwards.

Solution

There is a better way to solve this problem. You basically implement coroutines by running your generators on top of a "trampoline" system.

To make this work, you need to follow a couple patterns: 1. When you want to call another coroutine, yield it. 2. Instead of returning a value, yield it.

so

def f():
    result = g()
    #  
    return return_value

becomes

def f():
    result = yield g()
    #  
    yield return_value

Say you're in f(). The trampoline system called f(). When you yield a generator (say g()), the trampoline system calls g() on your behalf. Then when g() has finished yielding values, the trampoline system restarts f(). This means that you're not actually using the Python stack; the trampoline system manages a callstack instead.

When you yield something other than a generator, the trampoline system treats it as a return value. It passes that value back to the caller generator through the yield statement (using .send() method of generators).
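
The trampoline described above can be sketched in a few lines (an illustrative toy, not the PEP 342 Trampoline class): yielded generators are pushed onto an explicit stack, and any other yielded value is handed back to the calling generator via send():

```python
import types

def trampoline(start):
    # Keep our own call stack of generators instead of using Python's
    stack = [start]
    send_value = None
    while stack:
        try:
            yielded = stack[-1].send(send_value)
        except StopIteration:
            stack.pop()
            send_value = None
            continue
        if isinstance(yielded, types.GeneratorType):
            # "Calling" another coroutine: push it on the stack
            stack.append(yielded)
            send_value = None
        else:
            # A plain value: treat it as the callee's return value
            # and resume the caller with it
            stack.pop()
            send_value = yielded
    return send_value

def add_one(x):
    yield x + 1                  # "return" x + 1 to the caller

def f():
    result = yield add_one(41)   # "call" add_one via the trampoline
    yield result                 # "return" the result

print(trampoline(f()))           # prints 42
```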

Comments

This kind of system is extremely important and useful in asynchronous applications, like those using Tornado or Twisted. You can halt an entire callstack when it's blocked, go do something else, and then come back and continue execution of the first callstack where it left off.

The drawback of the above solution is that it requires you to write essentially all your functions as generators. It may be better to use an implementation of true coroutines for Python - see below.

Alternatives

There are several implementations of coroutines for Python, see: http://en.wikipedia.org/wiki/Coroutine#Implementations_for_Python

Greenlet is an excellent choice. It is a Python module that modifies the CPython interpreter to allow true coroutines by swapping out the callstack.

Python 3.3 should provide syntax for delegating to a subgenerator, see PEP 380 .
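
As a sketch of what that PEP 380 syntax enables (the names here are illustrative): the subgenerator's return value becomes the value of the yield from expression, and send() is forwarded to the subgenerator transparently:

```python
results = []

def averager():
    # Subgenerator: accumulate sent values until None arrives,
    # then hand back the average via the return statement
    total, count = 0.0, 0
    while True:
        value = yield
        if value is None:
            break
        total += value
        count += 1
    return total / count

def grader():
    # Delegating generator: yield from forwards each send() to
    # averager() and captures its return value
    avg = yield from averager()
    results.append(avg)

g = grader()
next(g)           # prime: runs into averager's first yield
g.send(10)
g.send(20)
try:
    g.send(None)  # averager returns, grader finishes
except StopIteration:
    pass
print(results)    # prints [15.0]
```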

gaborous ,Nov 9, 2012 at 10:04

Very useful and clear answer, thanks! However, when you say that standard Python coroutines essentially require writing all functions as generators, did you mean only first-level functions or really all functions? As you said above, when yielding something other than a generator, the trampoline system still works, so theoretically we can just yield at the first-layer functions any other functions that may or may not be generators themselves. Am I right? – gaborous Nov 9 '12 at 10:04

Simon Radford ,Nov 21, 2012 at 21:37

All "functions" between the trampoline system and a yield must be written as generators. You can call regular functions normally, but then you can't effectively "yield" from that function or any functions it calls. Does that make sense / answer your question? – Simon Radford Nov 21 '12 at 21:37

Simon Radford ,Nov 21, 2012 at 21:39

I highly recommend using Greenlet - it's a true implementation of coroutines for Python, and you don't have to use any of these patterns I've described. The trampoline stuff is for people who are interested in how you can do it in pure Python. – Simon Radford Nov 21 '12 at 21:39

Nick Sweeting ,Jun 7, 2015 at 22:12

To anyone reading this in 2015 or later, the new syntax is 'yield from' ( PEP 380 ) and it allows true coroutines in Python >3.3 . – Nick Sweeting Jun 7 '15 at 22:12

[Oct 13, 2019] Effective Python Item 40 Consider Coroutines to Run Many Functions Concurrently

Nov 16, 2017 | www.informit.com

Threads give Python programmers a way to run multiple functions seemingly at the same time (see Item 37: "Use Threads for Blocking I/O, Avoid for Parallelism"). But there are three big problems with threads:

  • They require special tools to coordinate with each other safely (see Item 38: "Use Lock to Prevent Data Races in Threads" and Item 39: "Use Queue to Coordinate Work Between Threads"). This makes code that uses threads harder to reason about than procedural, single-threaded code. This complexity makes threaded code more difficult to extend and maintain over time.
  • Threads require a lot of memory, about 8 MB per executing thread. On many computers, that amount of memory doesn't matter for a dozen threads or so. But what if you want your program to run tens of thousands of functions "simultaneously"? These functions may correspond to user requests to a server, pixels on a screen, particles in a simulation, etc. Running a thread per unique activity just won't work.
  • Threads are costly to start. If you want to constantly be creating new concurrent functions and finishing them, the overhead of using threads becomes large and slows everything down.

Python can work around all these issues with coroutines . Coroutines let you have many seemingly simultaneous functions in your Python programs. They're implemented as an extension to generators. The cost of starting a generator coroutine is a function call. Once active, they each use less than 1 KB of memory until they're exhausted.

Coroutines work by enabling the code consuming a generator to send a value back into the generator function after each yield expression. The generator function receives the value passed to the send function as the result of the corresponding yield expression.

def my_coroutine():
    while True:
        received = yield
        print('Received:', received)

it = my_coroutine()
next(it)             # Prime the coroutine
it.send('First')
it.send('Second')

>>>
Received: First
Received: Second

The initial call to next is required to prepare the generator for receiving the first send by advancing it to the first yield expression. Together, yield and send provide generators with a standard way to vary their next yielded value in response to external input.

For example, say you want to implement a generator coroutine that yields the minimum value it's been sent so far. Here, the bare yield prepares the coroutine with the initial minimum value sent in from the outside. Then the generator repeatedly yields the new minimum in exchange for the next value to consider.

def minimize():
    current = yield
    while True:
        value = yield current
        current = min(value, current)

The code consuming the generator can run one step at a time and will output the minimum value seen after each input.

it = minimize()
next(it)            # Prime the generator
print(it.send(10))
print(it.send(4))
print(it.send(22))
print(it.send(-1))

>>>
10
4
4
-1

The generator function will seemingly run forever, making forward progress with each new call to send . Like threads, coroutines are independent functions that can consume inputs from their environment and produce resulting outputs. The difference is that coroutines pause at each yield expression in the generator function and resume after each call to send from the outside. This is the magical mechanism of coroutines.

This behavior allows the code consuming the generator to take action after each yield expression in the coroutine. The consuming code can use the generator's output values to call other functions and update data structures. Most importantly, it can advance other generator functions until their next yield expressions. By advancing many separate generators in lockstep, they will all seem to be running simultaneously, mimicking the concurrent behavior of Python threads.
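
A minimal sketch of that lockstep idea (names are illustrative): the consuming code advances two generator coroutines round-robin, so their updates interleave as if they were running at the same time:

```python
def running_total(label, log):
    # Generator coroutine: after each send, record the new total
    total = 0
    while True:
        value = yield
        total += value
        log.append((label, total))

log = []
a = running_total('a', log)
b = running_total('b', log)
next(a)   # prime both coroutines
next(b)

# One "tick" advances each coroutine once; repeating the tick
# makes them appear to run concurrently
for tick in range(3):
    a.send(1)
    b.send(10)

print(log)
# prints [('a', 1), ('b', 10), ('a', 2), ('b', 20), ('a', 3), ('b', 30)]
```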

The Game of Life

Let me demonstrate the simultaneous behavior of coroutines with an example. Say you want to use coroutines to implement Conway's Game of Life. The rules of the game are simple. You have a two-dimensional grid of an arbitrary size. Each cell in the grid can either be alive or empty.

ALIVE = '*'
EMPTY = '-'

The game progresses one tick of the clock at a time. At each tick, each cell counts how many of its neighboring eight cells are still alive. Based on its neighbor count, each cell decides if it will keep living, die, or regenerate. Here's an example of a 5×5 Game of Life grid after four generations with time going to the right. I'll explain the specific rules further below.

  0   |   1   |   2   |   3   |   4
----- | ----- | ----- | ----- | -----
-*--- | --*-- | --**- | --*-- | -----
--**- | --**- | -*--- | -*--- | -**--
---*- | --**- | --**- | --*-- | -----
----- | ----- | ----- | ----- | -----

I can model this game by representing each cell as a generator coroutine running in lockstep with all the others.

To implement this, first I need a way to retrieve the status of neighboring cells. I can do this with a coroutine named count_neighbors that works by yielding Query objects. The Query class I define myself. Its purpose is to provide the generator coroutine with a way to ask its surrounding environment for information.

Query = namedtuple('Query', ('y', 'x'))

The coroutine yields a Query for each neighbor. The result of each yield expression will be the value ALIVE or EMPTY . That's the interface contract I've defined between the coroutine and its consuming code. The count_neighbors generator sees the neighbors' states and returns the count of living neighbors.

def count_neighbors(y, x):
    n_ = yield Query(y + 1, x + 0)  # North
    ne = yield Query(y + 1, x + 1)  # Northeast
    # Define e_, se, s_, sw, w_, nw ...
    # ...
    neighbor_states = [n_, ne, e_, se, s_, sw, w_, nw]
    count = 0
    for state in neighbor_states:
        if state == ALIVE:
            count += 1
    return count

I can drive the count_neighbors coroutine with fake data to test it. Here, I show how Query objects will be yielded for each neighbor. count_neighbors expects to receive cell states corresponding to each Query through the coroutine's send method. The final count is returned in the StopIteration exception that is raised when the generator is exhausted by the return statement.

it = count_neighbors(10, 5)
q1 = next(it)                  # Get the first query
print('First yield: ', q1)
q2 = it.send(ALIVE)            # Send q1 state, get q2
print('Second yield:', q2)
q3 = it.send(ALIVE)            # Send q2 state, get q3
# ...
try:
    count = it.send(EMPTY)     # Send q8 state, retrieve count
except StopIteration as e:
    print('Count: ', e.value)  # Value from return statement
>>>
First yield:  Query(y=11, x=5)
Second yield: Query(y=11, x=6)
...
Count:  2

Now I need the ability to indicate that a cell will transition to a new state in response to the neighbor count that it found from count_neighbors . To do this, I define another coroutine called step_cell . This generator will indicate transitions in a cell's state by yielding Transition objects. This is another class that I define, just like the Query class.

Transition = namedtuple('Transition', ('y', 'x', 'state'))

The step_cell coroutine receives its coordinates in the grid as arguments. It yields a Query to get the initial state of those coordinates. It runs count_neighbors to inspect the cells around it. It runs the game logic to determine what state the cell should have for the next clock tick. Finally, it yields a Transition object to tell the environment the cell's next state.

def game_logic(state, neighbors):
    # ...

def step_cell(y, x):
    state = yield Query(y, x)
    neighbors = yield from count_neighbors(y, x)
    next_state = game_logic(state, neighbors)
    yield Transition(y, x, next_state)

Importantly, the call to count_neighbors uses the yield from expression. This expression allows Python to compose generator coroutines together, making it easy to reuse smaller pieces of functionality and build complex coroutines from simpler ones. When count_neighbors is exhausted, the final value it returns (with the return statement) will be passed to step_cell as the result of the yield from expression.

Now, I can finally define the simple game logic for Conway's Game of Life. There are only three rules.

def game_logic(state, neighbors):
    if state == ALIVE:
        if neighbors < 2:
            return EMPTY     # Die: Too few
        elif neighbors > 3:
            return EMPTY     # Die: Too many
    else:
        if neighbors == 3:
            return ALIVE     # Regenerate
    return state

I can drive the step_cell coroutine with fake data to test it.

it = step_cell(10, 5)
q0 = next(it)           # Initial location query
print('Me:      ', q0)
q1 = it.send(ALIVE)     # Send my status, get neighbor query
print('Q1:      ', q1)
# ...
t1 = it.send(EMPTY)     # Send for q8, get game decision
print('Outcome: ', t1)

>>>
Me:       Query(y=10, x=5)
Q1:       Query(y=11, x=5)
...
Outcome:  Transition(y=10, x=5, state='-')

The goal of the game is to run this logic for a whole grid of cells in lockstep. To do this, I can further compose the step_cell coroutine into a simulate coroutine. This coroutine progresses the grid of cells forward by yielding from step_cell many times. After progressing every coordinate, it yields a TICK object to indicate that the current generation of cells have all transitioned.

TICK = object()

def simulate(height, width):
    while True:
        for y in range(height):
            for x in range(width):
                yield from step_cell(y, x)
        yield TICK

What's impressive about simulate is that it's completely disconnected from the surrounding environment. I still haven't defined how the grid is represented in Python objects, how Query , Transition , and TICK values are handled on the outside, nor how the game gets its initial state. But the logic is clear. Each cell will transition by running step_cell . Then the game clock will tick. This will continue forever, as long as the simulate coroutine is advanced.

This is the beauty of coroutines. They help you focus on the logic of what you're trying to accomplish. They decouple your code's instructions for the environment from the implementation that carries out your wishes. This enables you to run coroutines seemingly in parallel. This also allows you to improve the implementation of following those instructions over time without changing the coroutines.

Now, I want to run simulate in a real environment. To do that, I need to represent the state of each cell in the grid. Here, I define a class to contain the grid:

class Grid(object):
    def __init__(self, height, width):
        self.height = height
        self.width = width
        self.rows = []
        for _ in range(self.height):
            self.rows.append([EMPTY] * self.width)

    def __str__(self):
        # ...

The grid allows you to get and set the value of any coordinate. Coordinates that are out of bounds will wrap around, making the grid act like infinite looping space.

    def query(self, y, x):
        return self.rows[y % self.height][x % self.width]

    def assign(self, y, x, state):
        self.rows[y % self.height][x % self.width] = state

At last, I can define the function that interprets the values yielded from simulate and all of its interior coroutines. This function turns the instructions from the coroutines into interactions with the surrounding environment. It progresses the whole grid of cells forward a single step and then returns a new grid containing the next state.

def live_a_generation(grid, sim):
    progeny = Grid(grid.height, grid.width)
    item = next(sim)
    while item is not TICK:
        if isinstance(item, Query):
            state = grid.query(item.y, item.x)
            item = sim.send(state)
        else:  # Must be a Transition
            progeny.assign(item.y, item.x, item.state)
            item = next(sim)
    return progeny

To see this function in action, I need to create a grid and set its initial state. Here, I make a classic shape called a glider.

grid = Grid(5, 9)
grid.assign(0, 3, ALIVE)
# ...
print(grid)

>>>
---*-----
----*----
--***----
---------
---------

Now I can progress this grid forward one generation at a time. You can see how the glider moves down and to the right on the grid based on the simple rules from the game_logic function.

class ColumnPrinter(object):
    # ...

columns = ColumnPrinter()
sim = simulate(grid.height, grid.width)
for i in range(5):
    columns.append(str(grid))
    grid = live_a_generation(grid, sim)

print(columns)

>>>
    0     |     1     |     2     |     3     |     4
---*----- | --------- | --------- | --------- | ---------
----*---- | --*-*---- | ----*---- | ---*----- | ----*----
--***---- | ---**---- | --*-*---- | ----**--- | -----*---
--------- | ---*----- | ---**---- | ---**---- | ---***---
--------- | --------- | --------- | --------- | ---------

The best part about this approach is that I can change the game_logic function without having to update the code that surrounds it. I can change the rules or add larger spheres of influence with the existing mechanics of Query , Transition , and TICK . This demonstrates how coroutines enable the separation of concerns, which is an important design principle.

Coroutines in Python 2

Unfortunately, Python 2 is missing some of the syntactical sugar that makes coroutines so elegant in Python 3. There are two limitations. First, there is no yield from expression. That means that when you want to compose generator coroutines in Python 2, you need to include an additional loop at the delegation point.

# Python 2
def delegated():
    yield 1
    yield 2

def composed():
    yield 'A'
    for value in delegated():  # yield from in Python 3
        yield value
    yield 'B'

print list(composed())

>>>
['A', 1, 2, 'B']

The second limitation is that there is no support for the return statement in Python 2 generators. To get the same behavior that interacts correctly with try / except / finally blocks, you need to define your own exception type and raise it when you want to return a value.

# Python 2
class MyReturn(Exception):
    def __init__(self, value):
        self.value = value

def delegated():
    yield 1
    raise MyReturn(2)  # return 2 in Python 3
    yield 'Not reached'

def composed():
    try:
        for value in delegated():
            yield value
    except MyReturn as e:
        output = e.value
    yield output * 4

print list(composed())

>>>
[1, 8]
Things to Remember
  • Coroutines provide an efficient way to run tens of thousands of functions seemingly at the same time.
  • Within a generator, the value of the yield expression will be whatever value was passed to the generator's send method from the exterior code.
  • Coroutines give you a powerful tool for separating the core logic of your program from its interaction with the surrounding environment.
  • Python 2 doesn't support yield from or returning values from generators.

[Oct 13, 2019] https://www.quora.com/If-Donald-Knuth-were-25-years-old-today-which-programming-language-would-he-choose

Notable quotes:
"... He mostly writes in C today. ..."
Oct 13, 2019 | www.quora.com

Eugene Miya , A friend/colleague. Sometimes driver. Other shared experiences. Updated Mar 22, 2017

He mostly writes in C today.

I can assure you he at least knows about Python. Guido's office at Dropbox is 1 -- 2 blocks by a backdoor gate from Don's house.

I would tend to doubt that he would use R (I've used S before as one of my stat packages). Don would probably write something for himself.

Don is not big on functional languages, so I would doubt either Haskell (sorry Paul) or LISP (but McCarthy lived just around the corner from Don; I used to drive him to meetings; actually, I've driven all 3 of us to meetings, and he got his wife an electric version of my car based on riding in my car (score one for friend's choices)). He does use emacs and he does write MLISP macros, but he believes in being closer to the hardware which is why he sticks with MMIX (and MIX) in his books.

Don't discount him learning the machine language of a given architecture.

I'm having dinner with Don and Jill and a dozen other mutual friends in 3 weeks or so (our quarterly dinner). I can ask him then, if I remember (either a calendar entry or a note). I try not to bother him with things like this. Don is well connected to the hacker community.

Don's name was brought up at an undergrad architecture seminar today, but Don was not in the audience (an amazing audience; I took a photo for the collection of architects and other computer scientists in the audience (Hennessey and Patterson were talking)). I came close to biking by his house on my way back home.

We do have a mutual friend (actually, I introduced Don to my biology friend at Don's request) who arrives next week, and Don is my wine drinking proxy. So there is a chance I may see him sooner.

Steven de Rooij , Theoretical computer scientist. Answered Mar 9, 2017

Nice question :-)

Don Knuth would want to use something that’s low level, because details matter . So no Haskell; LISP is borderline. Perhaps if the Lisp machine ever had become a thing.

He’d want something with well-defined and simple semantics, so definitely no R. Python also contains quite a few strange ad hoc rules, especially in its OO and lambda features. Yes Python is easy to learn and it looks pretty, but Don doesn’t care about superficialities like that. He’d want a language whose version number is converging to a mathematical constant, which is also not in favor of R or Python.

What remains is C. Out of the five languages listed, my guess is Don would pick that one. But actually, his own old choice of Pascal suits him even better. I don't think any language invented since then scores higher on the Knuthometer than Knuth's own original pick.

And yes, I feel that this is actually a conclusion that bears some thinking about.

Dan Allen , I've been programming for 34 years now. Still not finished. Answered Mar 9, 2017

In The Art of Computer Programming I think he'd do exactly what he did. He'd invent his own architecture and implement programs in an assembly language targeting that theoretical machine.

He did that deliberately, because he wanted to reveal the details of algorithms at the lowest possible level, which is the machine level.

He didn't use any of the languages available at the time, and I don't see why that would suit his purpose now. All the languages above are too high-level for his purposes.

[Sep 30, 2019] Get command line arguments as string

Sep 30, 2019 | stackoverflow.com

KocT9H ,Jun 6, 2016 at 12:58

I want to print all command line arguments as a single string. Example of how I call my script and what I expect to be printed:
./RunT.py mytst.tst -c qwerty.c

mytst.tst -c qwerty.c

The code that does that:

args = str(sys.argv[1:])
args = args.replace("[","")
args = args.replace("]","")
args = args.replace(",","")
args = args.replace("'","")
print args

I did all replaces because sys.argv[1:] returns this:

['mytst.tst', '-c', 'qwerty.c']

Is there a better way to get same result? I don't like those multiple replace calls

KocT9H ,Apr 5 at 12:16

An option:

import sys
print(' '.join(sys.argv[1:]))

The join() method concatenates the elements of the iterable you pass it, using the string you call it on as the separator. So ' '.join(...) joins the arguments with single spaces ( ' ' ) between them.

cxw ,Apr 5 at 12:18

The command line arguments are already handled by the shell before they are sent into sys.argv . Therefore, shell quoting and whitespace are gone and cannot be exactly reconstructed.

Assuming the user double-quotes strings with spaces, here's a python program to reconstruct the command string with those quotes.

import sys

commandstring = ''

for arg in sys.argv[1:]:          # skip sys.argv[0] since the question didn't ask for it
    if ' ' in arg:
        commandstring += '"{}" '.format(arg)   # put the quotes back in
    else:
        commandstring += '{} '.format(arg)     # assume no space => no quotes

print(commandstring)

For example, the command line

./saferm.py sdkf lsadkf -r sdf -f sdf -fs -s "flksjfksdkfj sdfsdaflkasdf"

will produce the same arguments as output:

sdkf lsadkf -r sdf -f sdf -fs -s "flksjfksdkfj sdfsdaflkasdf"

since the user indeed double-quoted only arguments with strings.

TryToSolveItSimple ,Apr 5 at 12:27

You're getting a list object with all of your arguments when you use the syntax [1:] , which goes from the second element to the last. You could run a for loop to join them into one string:

args = sys.argv[1:]
result = ''

for arg in args:
    result += " " + arg

result = result.lstrip()   # drop the leading space added by the first iteration

pushpen.paul ,Jul 13 at 1:39

None of the previous answers properly escape all possible arguments, like empty args or those containing quotes. The closest you can get with minimal code is to use shlex.quote (available since Python 3.3):
import shlex
cmdline = " ".join(map(shlex.quote, sys.argv[1:]))
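On Python 3.8 and later, the quote-and-join can be collapsed into one call with shlex.join, which is equivalent to the map/join sketch above:

```python
import shlex
import sys

# shlex.join() quotes each argument only when needed and joins with spaces
cmdline = shlex.join(sys.argv[1:])
print(cmdline)
```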

[Sep 28, 2019] Python quoting conventions

Sep 28, 2019 | stackoverflow.com

Will Harris , answered Sep 11 '08 at 10:06 (edited Jul 7 '11 at 14:11)

  • 4 Interesting, I use them in exactly the same way. I don't remember ever reading anything to nudge me in that direction.
  • I also use triple single quotes for long string not intended for humans, like raw html. Maybe it's something to do with English quote rules. – Mike A Oct 21 '09 at 17:15
  • 12 Most Python coders code it that way. There is no explicit rule, but because we often read the code that way, it becomes a habit. – e-satis Mar 8 '10 at 14:35
  • I wonder if the single quotes for symbol-like things actually comes from the quote expression shortcut in Lisp/Scheme. In any case, it's intuitive. Also, me mateys, if we're following PEP 8 style guidelines, the functions really should be named lights_message() and is_pirate(). – yukondude May 22 '10 at 17:42
  • 8 I think Perl made a distinction between single quoted strings (no interpolation) and double quoted strings (with interpolation) and that python coders might have inherited the habit or never let it go. – Daren Thomas May 18 '11 at 11:58
  • 2 I use the same convention, plus I abuse it by having vim highlight everything inside triple single quotes as SQL. – RoundTower Jan 16 '12 at 22:26

mlissner ,Aug 6, 2011 at 6:19

Quoting the official docs at https://docs.python.org/2.0/ref/strings.html :

In plain English: String literals can be enclosed in matching single quotes (') or double quotes (").

So there is no difference. Instead, people will tell you to choose whichever style that matches the context, and to be consistent . And I would agree - adding that it is pointless to try to come up with "conventions" for this sort of thing because you'll only end up confusing any newcomers.

eksortso ,Jun 10, 2013 at 23:36

I used to prefer ' , especially for '''docstrings''' , as I find """this creates some fluff""" . Also, ' can be typed without the Shift key on my Swiss German keyboard.

I have since changed to using triple quotes for """docstrings""" , to conform to PEP 257 .

Garth Kidd ,Sep 11, 2008 at 10:21

I'm with Will:
  • Double quotes for text
  • Single quotes for anything that behaves like an identifier
  • Double quoted raw string literals for regexps
  • Tripled double quotes for docstrings

I'll stick with that even if it means a lot of escaping.

I get the most value out of single quoted identifiers standing out because of the quotes. The rest of the practices are there just to give those single quoted identifiers some standing room.

Tony Meyer ,Sep 11, 2008 at 8:33

If the string you have contains one, then you should use the other. For example, "You're able to do this" , or 'He said "Hi!"' . Other than that, you should simply be as consistent as you can (within a module, within a package, within a project, within an organisation).

If your code is going to be read by people who work with C/C++ (or if you switch between those languages and Python), then using '' for single-character strings, and "" for longer strings might help ease the transition. (Likewise for following other languages where they are not interchangeable).

The Python code I've seen in the wild tends to favour " over ' , but only slightly. The one exception is that """these""" are much more common than '''these''' , from what I have seen.

jblocksom ,Jul 30, 2009 at 20:35

Triple quoted comments are an interesting subtopic of this question. PEP 257 specifies triple quotes for doc strings . I did a quick check using Google Code Search and found that triple double quotes in Python are about 10x as popular as triple single quotes -- 1.3M vs 131K occurrences in the code Google indexes. So in the multi line case your code is probably going to be more familiar to people if it uses triple double quotes.

Paolo ,Sep 30, 2013 at 15:39

"If you're going to use apostrophes, 
       ^

you'll definitely want to use double quotes".
   ^

For that simple reason, I always use double quotes on the outside. Always

Speaking of fluff, what good is streamlining your string literals with ' if you're going to have to use escape characters to represent apostrophes? Does it offend coders to read novels? I can't imagine how painful high school English class was for you!

dolma33 ,Mar 17, 2014 at 23:13

Python uses quotes something like this (note that \042 and \047 are the octal escapes for " and ' respectively):

mystringliteral1="this is a string with 'quotes'"
mystringliteral2='this is a string with "quotes"'
mystringliteral3="""this is a string with "quotes" and more 'quotes'"""
mystringliteral4='''this is a string with 'quotes' and more "quotes"'''
mystringliteral5='this is a string with \"quotes\"'
mystringliteral6='this is a string with \042quotes\042'
mystringliteral7='this is a string with \047quotes\047'

print mystringliteral1
print mystringliteral2
print mystringliteral3
print mystringliteral4
print mystringliteral5
print mystringliteral6
print mystringliteral7

Which gives the following output:

this is a string with 'quotes'
this is a string with "quotes"
this is a string with "quotes" and more 'quotes'
this is a string with 'quotes' and more "quotes"
this is a string with "quotes"
this is a string with "quotes"
this is a string with 'quotes'

Matt Sheppard ,Sep 11, 2008 at 8:40

I use double quotes in general, but not for any specific reason - Probably just out of habit from Java.

I guess you're also more likely to want apostrophes in an inline literal string than you are to want double quotes.

schwa ,Sep 28, 2008 at 3:35

Personally I stick with one or the other. It doesn't matter. And providing your own meaning to either quote is just to confuse other people when you collaborate.

maxpolk ,Sep 21, 2013 at 17:45

It's probably a stylistic preference more than anything. I just checked PEP 8 and didn't see any mention of single versus double quotes.

I prefer single quotes because it's only one keystroke instead of two. That is, I don't have to mash the Shift key to type a single quote.

stivlo ,Mar 11, 2012 at 4:25

In Perl you want to use single quotes when you have a string which doesn't need to interpolate variables or escaped characters like \n, \t, \r, etc.

PHP makes the same distinction as Perl: content in single quotes will not be interpreted (not even \n will be converted), as opposed to double quotes which can contain variables to have their value printed out.

Python does not, I'm afraid. Technically speaking, there is no $ token (or the like) to separate a variable from ordinary text in Python. Both features make Python more readable and less confusing, after all. Single and double quotes can be used interchangeably in Python.

Andrew Dalke ,Mar 10, 2009 at 6:25

I chose to use double quotes because they are easier to see.

Alphy ,Jul 22, 2009 at 19:50

I just use whatever strikes my fancy at the time; it's convenient to be able to switch between the two at a whim!

Of course, when quoting quote characters, switching between the two might not be so whimsical after all...

Vinko Vrsalovic ,Sep 11, 2008 at 8:25

Your team's taste or your project's coding guidelines.

If you are in a multilanguage environment, you might wish to encourage the use of the same type of quotes for strings that the other language uses, for instance. Else, I personally like best the look of '

Mario F ,Sep 11, 2008 at 8:28

None as far as I know. Although if you look at some code, " " is commonly used for strings of text (I guess ' is more common inside text than "), and ' ' appears in hashkeys and things like that.

Acumenus ,Apr 16, 2013 at 22:32

I aim to minimize both pixels and surprise. I typically prefer ' in order to minimize pixels, but " instead if the string has an apostrophe, again to minimize pixels. For a docstring, however, I prefer """ over ''' because the latter is non-standard, uncommon, and therefore surprising. If now I have a bunch of strings where I used " per the above logic, but also one that can get away with a ' , I may still use " in it to preserve consistency, only to minimize surprise.

Perhaps it helps to think of the pixel minimization philosophy in the following way. Would you rather that English characters looked like A B C or AA BB CC ? The latter choice wastes 50% of the non-empty pixels.

Philipp ,Jul 5, 2010 at 13:06

I use double quotes because I have been doing so for years in most languages (C++, Java, VB ) except Bash, because I also use double quotes in normal text and because I'm using a (modified) non-English keyboard where both characters require the shift key.

Adam Smith ,Dec 16, 2013 at 23:38

' = "

/ = \ = \\

example :

f = open('c:\word.txt', 'r')
f = open("c:\word.txt", "r")
f = open("c:/word.txt", "r")
f = open("c:\\\word.txt", "r")

Results are the same

=>> no, they're not the same. A single backslash will escape characters. You just happen to luck out in that example because \k and \w aren't valid escapes like \t or \n or \\ or \"

If you want to use single backslashes (and have them interpreted as such), then you need to use a "raw" string. You can do this by putting an ' r ' in front of the string

im_raw = r'c:\temp.txt'
non_raw = 'c:\\temp.txt'
another_way = 'c:/temp.txt'

As far as paths in Windows are concerned, forward slashes are interpreted the same way. Clearly the string itself is different, though. I wouldn't guarantee that an external device handles them the same way.
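The difference between plain and raw literals is easy to see by counting characters; in the plain literal the \t collapses into a single tab character (Python 3 syntax shown):

```python
plain = 'c:\temp.txt'    # \t here is one tab character
raw = r'c:\temp.txt'     # the raw literal keeps the backslash and the t

print(len(plain), len(raw))  # 10 11
```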

[Sep 28, 2019] The Power of Python's String Templates

Sep 28, 2019 | www.thoughtco.com

By Albert Lukaszewski, Ph.D., a veteran computer programmer, software engineer, and author who specializes in the Python language. Updated December 31, 2018.

Python is an interpreted, object-oriented, high-level programming language. It is easy to learn because its syntax emphasizes readability, which reduces the expense of program maintenance. Many programmers love working with Python because, without the compilation step, testing and debugging go quickly.

Python Web Templating

Templating, especially web templating, represents data in forms usually intended to be readable by a viewer. The simplest form of a templating engine substitutes values into the template to produce the output. Aside from the string constants and the deprecated string functions, which moved to string methods, Python's string module also includes string templates. The template itself is a class that receives a string as its argument. The object instantiated from that class is called a template string object. Template strings were first introduced in Python 2.4. Where the string formatting operators use the percent sign for substitutions, the template object uses dollar signs.
  • $$ is an escape sequence; it is replaced with a single $ .
  • $<identifier> names a substitution placeholder matching a mapping key of <identifier>. By default, <identifier> must spell a Python identifier. The first non-identifier character after the $ character terminates this placeholder specification.
  • ${<identifier>} is equivalent to $<identifier>. It is required when valid identifier characters follow the placeholder but are not part of the placeholder, such as ${noun}ification.
Outside of these uses of the dollar sign, any appearance of $ causes a ValueError to be raised. The methods available through template strings are as follows:
  • class string.Template(template): The constructor takes a single argument, which is the template string.
  • substitute(mapping, **keywords): Substitutes values from mapping (a dictionary-like object) into the template string. If the keywords argument is used, keywords also supply placeholder values; where both mapping and keywords name the same placeholder, the latter takes precedence. If a placeholder is missing from both mapping and keywords, a KeyError is raised.
  • safe_substitute(mapping, **keywords): Functions similarly to substitute(), except that if a placeholder is missing from mapping and keywords, the original placeholder appears in the result unchanged, thus avoiding the KeyError. Also, any stray "$" is left as a dollar sign rather than raising ValueError.
Template objects also have one publicly available attribute:
  • template is the string passed to the constructor's template argument. While read-only access is not enforced, it is best not to change this attribute in your program.
The sample shell session below serves to illustrate template string objects.

>>> from string import Template
>>> s = Template('$when, $who $action $what.')
>>> s.substitute(when='In the summer', who='John', action='drinks', what='iced tea')
'In the summer, John drinks iced tea.'
>>> s.substitute(when='At night', who='Jean', action='eats', what='popcorn')
'At night, Jean eats popcorn.'
>>> s.template
'$when, $who $action $what.'
>>> d = dict(when='in the summer')
>>> Template('$who $action $what $when').safe_substitute(d)
'$who $action $what in the summer'
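The session above does not exercise the $$ escape or the ${identifier} form, so here is a short sketch of both rules (the placeholder names noun and price are made up for illustration):

```python
from string import Template

# ${noun} delimits the identifier because letters follow it;
# $$ is replaced with a single literal dollar sign
t = Template('${noun}ification costs $$${price}')
print(t.substitute(noun='Python', price=10))  # Pythonification costs $10
```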

[Sep 28, 2019] python - Get command line arguments as string - Stack Overflow

Sep 28, 2019 | stackoverflow.com


Or you can use format, multiplying the format string by the length of sys.argv:

(' {}'*len(sys.argv)).lstrip().format(*sys.argv)

Or with the % operator (note that % needs a tuple, not a list):

(' %s'*len(sys.argv)).lstrip() % tuple(sys.argv)

[Sep 27, 2019] String Manipulation in Python

Sep 27, 2019 | www.pythonforbeginners.com

String Manipulation in Python Overview

A string is a sequence of characters in order. 

A character is anything you can type on the keyboard in one keystroke,
like a letter, a number, or a backslash. 

Strings can have spaces: "hello world". 

An empty string is a string that has 0 characters.

Python strings are immutable

Python recognizes as a string everything that is delimited by quotation marks
(" " or ' ').
String Manipulation
To manipulate strings, we can use some of Python's built-in methods.
Creation
word = "Hello World"

>>> print word
Hello World
Accessing

Use [ ] to access characters in a string
word = "Hello World"
letter=word[0]

>>> print letter
H
Length
word = "Hello World"

>>> len(word)
11
Finding
word = "Hello World"

>>> print word.count('l')      # count how many times l appears in the string
3

>>> print word.find("H")       # index of the first occurrence of H
0

>>> print word.index("World")  # index of the substring World
6
Count
s =  "Count, the number     of spaces"

>>> print s.count(' ')
8
Slicing
Use [ # : # ] to get set of letter

Keep in mind that python, as many other languages, starts to count from 0!!
word = "Hello World"

print word[0]          #get one char of the word
print word[0:1]        #get one char of the word (same as above)
print word[0:3]        #get the first three char
print word[:3]         #get the first three char
print word[-3:]        #get the last three char
print word[3:]         #get all but the three first char
print word[:-3]        #get all but the three last character
word = "Hello World"
 
word[start:end]         # items start through end-1
word[start:]            # items start through the rest of the list
word[:end]              # items from the beginning through end-1
word[:]                 # a copy of the whole list
Split Strings
word = "Hello World"

>>> word.split(' ')  # Split on whitespace
['Hello', 'World']
Startswith / Endswith
word = "Hello World"

>>> word.startswith("H")
True

>>> word.endswith("d")
True

>>> word.endswith("w")
False
Repeat Strings
print "."* 10    # prints ten dots

>>> print "." * 10
..........
Replacing
word = "Hello World"

>>> word.replace("Hello", "Goodbye")
'Goodbye World'
Changing Upper and Lower Case Strings
string = "Hello World"

>>> print string.upper()
HELLO WORLD

>>> print string.lower()
hello world

>>> print string.title()
Hello World

>>> print string.capitalize()
Hello world

>>> print string.swapcase()
hELLO wORLD
Reversing
string = "Hello World"

>>> print ' '.join(reversed(string))
d l r o W   o l l e H
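The join above inserts spaces between the reversed characters; to reverse the string itself, the more common idiom is an extended slice (Python 3 print shown):

```python
string = "Hello World"

# a slice step of -1 walks the string backwards
print(string[::-1])  # dlroW olleH
```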
Strip

Python strings have the strip(), lstrip(), rstrip() methods for removing 
any character from both ends of a string. 

If the characters to be removed are not specified then white-space will be removed
word = "Hello World"
Strip off newline characters from the end of the string:
>>> print word.strip('\n')
Hello World
strip()     #removes from both ends
lstrip()    #removes leading characters (Left-strip)
rstrip()    #removes trailing characters (Right-strip)

>>> word = "    xyz    "

>>> print word
    xyz    

>>> print word.strip()
xyz

>>> print word.lstrip()
xyz    

>>> print word.rstrip()
    xyz
Concatenation

To concatenate strings in Python use the "+" operator.
"Hello " + "World" # = "Hello World"
"Hello " + "World" + "!"# = "Hello World!"
Join
>>> print ":".join(word)  # add a : between every char
H:e:l:l:o: :W:o:r:l:d

>>> print " ".join(word)  # add a whitespace between every char
H e l l o   W o r l d
Testing

A string in Python can be tested for truth value. 

The return type will be in Boolean value (True or False)
word = "Hello World"
 
word.isalnum()         #check if all char are alphanumeric 
word.isalpha()         #check if all char in the string are alphabetic
word.isdigit()         #test if string contains digits
word.istitle()         #test if string contains title words
word.isupper()         #test if string contains upper case
word.islower()         #test if string contains lower case
word.isspace()         #test if string contains spaces
word.endswith('d')     #test if string endswith a d
word.startswith('H')   #test if string startswith H

[Sep 27, 2019] 6.1. string -- Common string operations -- Python 3.4.10 documentation

Sep 27, 2019 | docs.python.org

Source code: Lib/string.py


See also

Text Sequence Type -- str

String Methods

6.1.1. String constants

The constants defined in this module are:

string.ascii_letters
The concatenation of the ascii_lowercase and ascii_uppercase constants described below. This value is not locale-dependent.
string.ascii_lowercase
The lowercase letters 'abcdefghijklmnopqrstuvwxyz' . This value is not locale-dependent and will not change.
string.ascii_uppercase
The uppercase letters 'ABCDEFGHIJKLMNOPQRSTUVWXYZ' . This value is not locale-dependent and will not change.
string.digits
The string '0123456789' .
string.hexdigits
The string '0123456789abcdefABCDEF' .
string.octdigits
The string '01234567' .
string.punctuation
String of ASCII characters which are considered punctuation characters in the C locale.
string.printable
String of ASCII characters which are considered printable. This is a combination of digits , ascii_letters , punctuation , and whitespace .
string.whitespace
A string containing all ASCII characters that are considered whitespace. This includes the characters space, tab, linefeed, return, formfeed, and vertical tab.
6.1.2. String Formatting

The built-in string class provides the ability to do complex variable substitutions and value formatting via the format() method described in PEP 3101 . The Formatter class in the string module allows you to create and customize your own string formatting behaviors using the same implementation as the built-in format() method.

class string.Formatter
The Formatter class has the following public methods:
format(format_string, *args, **kwargs)
format() is the primary API method. It takes a format string and an arbitrary set of positional and keyword arguments. format() is just a wrapper that calls vformat() .
vformat(format_string, args, kwargs)
This function does the actual work of formatting. It is exposed as a separate function for cases where you want to pass in a predefined dictionary of arguments, rather than unpacking and repacking the dictionary as individual arguments using the *args and **kwargs syntax. vformat() does the work of breaking up the format string into character data and replacement fields. It calls the various methods described below.

In addition, the Formatter defines a number of methods that are intended to be replaced by subclasses:

parse(format_string)
Loop over the format_string and return an iterable of tuples ( literal_text , field_name , format_spec , conversion ). This is used by vformat() to break the string into either literal text, or replacement fields.

The values in the tuple conceptually represent a span of literal text followed by a single replacement field. If there is no literal text (which can happen if two replacement fields occur consecutively), then literal_text will be a zero-length string. If there is no replacement field, then the values of field_name , format_spec and conversion will be None .

get_field(field_name, args, kwargs)
Given field_name as returned by parse() (see above), convert it to an object to be formatted. Returns a tuple (obj, used_key). The default version takes strings of the form defined in PEP 3101 , such as "0[name]" or "label.title". args and kwargs are as passed in to vformat() . The return value used_key has the same meaning as the key parameter to get_value() .
get_value(key, args, kwargs)
Retrieve a given field value. The key argument will be either an integer or a string. If it is an integer, it represents the index of the positional argument in args ; if it is a string, then it represents a named argument in kwargs .

The args parameter is set to the list of positional arguments to vformat() , and the kwargs parameter is set to the dictionary of keyword arguments.

For compound field names, these functions are only called for the first component of the field name; subsequent components are handled through normal attribute and indexing operations.

So for example, the field expression '0.name' would cause get_value() to be called with a key argument of 0. The name attribute will be looked up after get_value() returns by calling the built-in getattr() function.

If the index or keyword refers to an item that does not exist, then an IndexError or KeyError should be raised.

check_unused_args(used_args, args, kwargs)
Implement checking for unused arguments if desired. The arguments to this function are the set of all argument keys that were actually referred to in the format string (integers for positional arguments, and strings for named arguments), and a reference to the args and kwargs that were passed to vformat. The set of unused args can be calculated from these parameters. check_unused_args() is assumed to raise an exception if the check fails.
format_field(value, format_spec)
format_field() simply calls the global format() built-in. The method is provided so that subclasses can override it.
convert_field(value, conversion)
Converts the value (returned by get_field() ) given a conversion type (as in the tuple returned by the parse() method). The default version understands 's' (str), 'r' (repr) and 'a' (ascii) conversion types.
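As a sketch of the subclassing hooks above, here is a hypothetical Formatter subclass whose get_value() falls back to a default for missing keyword arguments instead of raising KeyError (the class name and the '<missing>' marker are made up for illustration):

```python
from string import Formatter

class DefaultFormatter(Formatter):
    # get_value() is called once per replacement field; for string keys,
    # fall back to a marker instead of raising KeyError
    def get_value(self, key, args, kwargs):
        if isinstance(key, str):
            return kwargs.get(key, '<missing>')
        return super().get_value(key, args, kwargs)

f = DefaultFormatter()
print(f.format('{name} scored {score}', name='Ada'))  # Ada scored <missing>
```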
6.1.3. Format String Syntax

The str.format() method and the Formatter class share the same syntax for format strings (although in the case of Formatter , subclasses can define their own format string syntax).

Format strings contain "replacement fields" surrounded by curly braces {} . Anything that is not contained in braces is considered literal text, which is copied unchanged to the output. If you need to include a brace character in the literal text, it can be escaped by doubling: {{ and }} .

The grammar for a replacement field is as follows:

replacement_field ::=  "{" [field_name] ["!" conversion] [":" format_spec] "}"
field_name        ::=  arg_name ("." attribute_name | "[" element_index "]")*
arg_name          ::=  [identifier | integer]
attribute_name    ::=  identifier
element_index     ::=  integer | index_string
index_string      ::=  <any source character except "]"> +
conversion        ::=  "r" | "s" | "a"
format_spec       ::=  <described in the next section>

In less formal terms, the replacement field can start with a field_name that specifies the object whose value is to be formatted and inserted into the output instead of the replacement field. The field_name is optionally followed by a conversion field, which is preceded by an exclamation point '!' , and a format_spec , which is preceded by a colon ':' . These specify a non-default format for the replacement value.

See also the Format Specification Mini-Language section.

The field_name itself begins with an arg_name that is either a number or a keyword. If it's a number, it refers to a positional argument, and if it's a keyword, it refers to a named keyword argument. If the numerical arg_names in a format string are 0, 1, 2, ... in sequence, they can all be omitted (not just some) and the numbers 0, 1, 2, ... will be automatically inserted in that order. Because arg_name is not quote-delimited, it is not possible to specify arbitrary dictionary keys (e.g., the strings '10' or ':-]' ) within a format string. The arg_name can be followed by any number of index or attribute expressions. An expression of the form '.name' selects the named attribute using getattr() , while an expression of the form '[index]' does an index lookup using __getitem__() .

Changed in version 3.1: The positional argument specifiers can be omitted, so '{} {}' is equivalent to '{0} {1}' .

Some simple format string examples:

"First, thou shalt count to {0}" # References first positional argument
"Bring me a {}"                  # Implicitly references the first positional argument
"From {} to {}"                  # Same as "From {0} to {1}"
"My quest is {name}"             # References keyword argument 'name'
"Weight in tons {0.weight}"      # 'weight' attribute of first positional arg
"Units destroyed: {players[0]}"  # First element of keyword argument 'players'.

The conversion field causes a type coercion before formatting. Normally, the job of formatting a value is done by the __format__() method of the value itself. However, in some cases it is desirable to force a type to be formatted as a string, overriding its own definition of formatting. By converting the value to a string before calling __format__() , the normal formatting logic is bypassed.

Three conversion flags are currently supported: '!s' which calls str() on the value, '!r' which calls repr() and '!a' which calls ascii() .

Some examples:

"Harold's a clever {0!s}"        # Calls str() on the argument first
"Bring out the holy {name!r}"    # Calls repr() on the argument first
"More {!a}"                      # Calls ascii() on the argument first

The format_spec field contains a specification of how the value should be presented, including such details as field width, alignment, padding, decimal precision and so on. Each value type can define its own "formatting mini-language" or interpretation of the format_spec .

Most built-in types support a common formatting mini-language, which is described in the next section.

A format_spec field can also include nested replacement fields within it. These nested replacement fields can contain only a field name; conversion flags and format specifications are not allowed. The replacement fields within the format_spec are substituted before the format_spec string is interpreted. This allows the formatting of a value to be dynamically specified.

See the Format examples section for some examples.

6.1.3.1. Format Specification Mini-Language

"Format specifications" are used within replacement fields contained within a format string to define how individual values are presented (see Format String Syntax ). They can also be passed directly to the built-in format() function. Each formattable type may define how the format specification is to be interpreted.

Most built-in types implement the following options for format specifications, although some of the formatting options are only supported by the numeric types.

A general convention is that an empty format string ( "" ) produces the same result as if you had called str() on the value. A non-empty format string typically modifies the result.

The general form of a standard format specifier is:

format_spec ::=  [[fill]align][sign][#][0][width][,][.precision][type]
fill        ::=  <any character>
align       ::=  "<" | ">" | "=" | "^"
sign        ::=  "+" | "-" | " "
width       ::=  integer
precision   ::=  integer
type        ::=  "b" | "c" | "d" | "e" | "E" | "f" | "F" | "g" | "G" | "n" | "o" | "s" | "x" | "X" | "%"

If a valid align value is specified, it can be preceded by a fill character that can be any character and defaults to a space if omitted. Note that it is not possible to use { and } as fill char while using the str.format() method; this limitation however doesn't affect the format() function.

The meaning of the various alignment options is as follows:

Option Meaning
'<' Forces the field to be left-aligned within the available space (this is the default for most objects).
'>' Forces the field to be right-aligned within the available space (this is the default for numbers).
'=' Forces the padding to be placed after the sign (if any) but before the digits. This is used for printing fields in the form '+000000120'. This alignment option is only valid for numeric types.
'^' Forces the field to be centered within the available space.

Note that unless a minimum field width is defined, the field width will always be the same size as the data to fill it, so that the alignment option has no meaning in this case.

The sign option is only valid for number types, and can be one of the following:

Option Meaning
'+' indicates that a sign should be used for both positive as well as negative numbers.
'-' indicates that a sign should be used only for negative numbers (this is the default behavior).
space indicates that a leading space should be used on positive numbers, and a minus sign on negative numbers.

The '#' option causes the "alternate form" to be used for the conversion. The alternate form is defined differently for different types. This option is only valid for integer, float, complex and Decimal types. For integers, when binary, octal, or hexadecimal output is used, this option adds the respective prefix '0b' , '0o' , or '0x' to the output value. For floats, complex and Decimal the alternate form causes the result of the conversion to always contain a decimal-point character, even if no digits follow it. Normally, a decimal-point character appears in the result of these conversions only if a digit follows it. In addition, for 'g' and 'G' conversions, trailing zeros are not removed from the result.
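The float behavior of '#' is easy to miss; a quick illustration:

```python
# '#' forces a decimal point even when no digits follow it...
print(format(3.0, '.0f'))    # 3
print(format(3.0, '#.0f'))   # 3.

# ...and keeps trailing zeros under the 'g' presentation type.
print(format(1.5, '.3g'))    # 1.5
print(format(1.5, '#.3g'))   # 1.50
```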

The ',' option signals the use of a comma for a thousands separator. For a locale aware separator, use the 'n' integer presentation type instead.

Changed in version 3.1: Added the ',' option (see also PEP 378 ).

width is a decimal integer defining the minimum field width. If not specified, then the field width will be determined by the content.

Preceding the width field by a zero ( '0' ) character enables sign-aware zero-padding for numeric types. This is equivalent to a fill character of '0' with an alignment type of '=' .

The precision is a decimal number indicating how many digits should be displayed after the decimal point for a floating point value formatted with 'f' and 'F' , or before and after the decimal point for a floating point value formatted with 'g' or 'G' . For non-number types the field indicates the maximum field size - in other words, how many characters will be used from the field content. The precision is not allowed for integer values.
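A small sketch of the width, zero-padding, and precision rules just described:

```python
# Sign-aware zero-padding: a leading '0' before the width is equivalent
# to a fill character of '0' with alignment '='.
print('{:08.3f}'.format(-3.14159))   # -003.142
print('{:0=8.3f}'.format(-3.14159))  # -003.142 (explicit equivalent)

# For non-number types, precision caps the number of characters used.
print('{:.5}'.format('xylophone'))   # xylop
```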

Finally, the type determines how the data should be presented.

The available string presentation types are:

Type Meaning
's' String format. This is the default type for strings and may be omitted.
None The same as 's' .

The available integer presentation types are:

Type Meaning
'b' Binary format. Outputs the number in base 2.
'c' Character. Converts the integer to the corresponding unicode character before printing.
'd' Decimal Integer. Outputs the number in base 10.
'o' Octal format. Outputs the number in base 8.
'x' Hex format. Outputs the number in base 16, using lower- case letters for the digits above 9.
'X' Hex format. Outputs the number in base 16, using upper- case letters for the digits above 9.
'n' Number. This is the same as 'd' , except that it uses the current locale setting to insert the appropriate number separator characters.
None The same as 'd' .

In addition to the above presentation types, integers can be formatted with the floating point presentation types listed below (except 'n' and None). When doing so, float() is used to convert the integer to a floating point number before formatting.

The available presentation types for floating point and decimal values are:

Type Meaning
'e' Exponent notation. Prints the number in scientific notation using the letter 'e' to indicate the exponent. The default precision is 6 .
'E' Exponent notation. Same as 'e' except it uses an upper case 'E' as the separator character.
'f' Fixed point. Displays the number as a fixed-point number. The default precision is 6 .
'F' Fixed point. Same as 'f' , but converts nan to NAN and inf to INF .
'g'

General format. For a given precision p >= 1 , this rounds the number to p significant digits and then formats the result in either fixed-point format or in scientific notation, depending on its magnitude.

The precise rules are as follows: suppose that the result formatted with presentation type 'e' and precision p-1 would have exponent exp . Then if -4 <= exp < p , the number is formatted with presentation type 'f' and precision p-1-exp . Otherwise, the number is formatted with presentation type 'e' and precision p-1 . In both cases insignificant trailing zeros are removed from the significand, and the decimal point is also removed if there are no remaining digits following it.

Positive and negative infinity, positive and negative zero, and nans, are formatted as inf , -inf , 0 , -0 and nan respectively, regardless of the precision.

A precision of 0 is treated as equivalent to a precision of 1 . The default precision is 6 .

'G' General format. Same as 'g' except switches to 'E' if the number gets too large. The representations of infinity and NaN are uppercased, too.
'n' Number. This is the same as 'g' , except that it uses the current locale setting to insert the appropriate number separator characters.
'%' Percentage. Multiplies the number by 100 and displays in fixed ( 'f' ) format, followed by a percent sign.
None Similar to 'g' , except that fixed-point notation, when used, has at least one digit past the decimal point. The default precision is as high as needed to represent the particular value. The overall effect is to match the output of str() as altered by the other format modifiers.
6.1.3.2. Format examples

This section contains examples of the new format syntax and comparison with the old % -formatting.

In most of the cases the syntax is similar to the old % -formatting, with the addition of the {} and with : used instead of % . For example, '%03.2f' can be translated to '{:03.2f}' .

The new format syntax also supports new and different options, shown in the following examples.

Accessing arguments by position:

>>> '{0}, {1}, {2}'.format('a', 'b', 'c')
'a, b, c'
>>> '{}, {}, {}'.format('a', 'b', 'c')  # 3.1+ only
'a, b, c'
>>> '{2}, {1}, {0}'.format('a', 'b', 'c')
'c, b, a'
>>> '{2}, {1}, {0}'.format(*'abc')      # unpacking argument sequence
'c, b, a'
>>> '{0}{1}{0}'.format('abra', 'cad')   # arguments' indices can be repeated
'abracadabra'

Accessing arguments by name:

>>> 'Coordinates: {latitude}, {longitude}'.format(latitude='37.24N', longitude='-115.81W')
'Coordinates: 37.24N, -115.81W'
>>> coord = {'latitude': '37.24N', 'longitude': '-115.81W'}
>>> 'Coordinates: {latitude}, {longitude}'.format(**coord)
'Coordinates: 37.24N, -115.81W'

Accessing arguments' attributes:

>>> c = 3-5j
>>> ('The complex number {0} is formed from the real part {0.real} '
...  'and the imaginary part {0.imag}.').format(c)
'The complex number (3-5j) is formed from the real part 3.0 and the imaginary part -5.0.'
>>> class Point:
...     def __init__(self, x, y):
...         self.x, self.y = x, y
...     def __str__(self):
...         return 'Point({self.x}, {self.y})'.format(self=self)
...
>>> str(Point(4, 2))
'Point(4, 2)'

Accessing arguments' items:

>>> coord = (3, 5)
>>> 'X: {0[0]};  Y: {0[1]}'.format(coord)
'X: 3;  Y: 5'

Replacing %s and %r :

>>> "repr() shows quotes: {!r}; str() doesn't: {!s}".format('test1', 'test2')
"repr() shows quotes: 'test1'; str() doesn't: test2"

Aligning the text and specifying a width:

>>> '{:<30}'.format('left aligned')
'left aligned                  '
>>> '{:>30}'.format('right aligned')
'                 right aligned'
>>> '{:^30}'.format('centered')
'           centered           '
>>> '{:*^30}'.format('centered')  # use '*' as a fill char
'***********centered***********'

Replacing %+f , %-f , and % f and specifying a sign:

>>> '{:+f}; {:+f}'.format(3.14, -3.14)  # show it always
'+3.140000; -3.140000'
>>> '{: f}; {: f}'.format(3.14, -3.14)  # show a space for positive numbers
' 3.140000; -3.140000'
>>> '{:-f}; {:-f}'.format(3.14, -3.14)  # show only the minus -- same as '{:f}; {:f}'
'3.140000; -3.140000'

Replacing %x and %o and converting the value to different bases:

>>> # format also supports binary numbers
>>> "int: {0:d};  hex: {0:x};  oct: {0:o};  bin: {0:b}".format(42)
'int: 42;  hex: 2a;  oct: 52;  bin: 101010'
>>> # with 0x, 0o, or 0b as prefix:
>>> "int: {0:d};  hex: {0:#x};  oct: {0:#o};  bin: {0:#b}".format(42)
'int: 42;  hex: 0x2a;  oct: 0o52;  bin: 0b101010'

Using the comma as a thousands separator:

>>> '{:,}'.format(1234567890)
'1,234,567,890'

Expressing a percentage:

>>> points = 19
>>> total = 22
>>> 'Correct answers: {:.2%}'.format(points/total)
'Correct answers: 86.36%'

Using type-specific formatting:

>>> import datetime
>>> d = datetime.datetime(2010, 7, 4, 12, 15, 58)
>>> '{:%Y-%m-%d %H:%M:%S}'.format(d)
'2010-07-04 12:15:58'

Nesting arguments and more complex examples:

>>> for align, text in zip('<^>', ['left', 'center', 'right']):
...     '{0:{fill}{align}16}'.format(text, fill=align, align=align)
...
'left<<<<<<<<<<<<'
'^^^^^center^^^^^'
'>>>>>>>>>>>right'
>>>
>>> octets = [192, 168, 0, 1]
>>> '{:02X}{:02X}{:02X}{:02X}'.format(*octets)
'C0A80001'
>>> int(_, 16)
3232235521
>>>
>>> width = 5
>>> for num in range(5,12): 
...     for base in 'dXob':
...         print('{0:{width}{base}}'.format(num, base=base, width=width), end=' ')
...     print()
...
    5     5     5   101
    6     6     6   110
    7     7     7   111
    8     8    10  1000
    9     9    11  1001
   10     A    12  1010
   11     B    13  1011
6.1.4. Template strings

Templates provide simpler string substitutions as described in PEP 292 . Instead of the normal % -based substitutions, Templates support $ -based substitutions, using the following rules:

  • $$ is an escape; it is replaced with a single $ .
  • $identifier names a substitution placeholder matching a mapping key of "identifier" . By default, "identifier" is restricted to any case-insensitive ASCII alphanumeric string (including underscores) that starts with an underscore or ASCII letter. The first non-identifier character after the $ character terminates this placeholder specification.
  • ${identifier} is equivalent to $identifier . It is required when valid identifier characters follow the placeholder but are not part of the placeholder, such as "${noun}ification" .

Any other appearance of $ in the string will result in a ValueError being raised.

The string module provides a Template class that implements these rules. The methods of Template are:

class string.Template(template)
The constructor takes a single argument which is the template string.
substitute(mapping, **kwds)
Performs the template substitution, returning a new string. mapping is any dictionary-like object with keys that match the placeholders in the template. Alternatively, you can provide keyword arguments, where the keywords are the placeholders. When both mapping and kwds are given and there are duplicates, the placeholders from kwds take precedence.
safe_substitute(mapping, **kwds)
Like substitute() , except that if placeholders are missing from mapping and kwds , instead of raising a KeyError exception, the original placeholder will appear in the resulting string intact. Also, unlike with substitute() , any other appearances of the $ will simply return $ instead of raising ValueError .

While other exceptions may still occur, this method is called "safe" because it always tries to return a usable string instead of raising an exception. In another sense, safe_substitute() may be anything other than safe, since it will silently ignore malformed templates containing dangling delimiters, unmatched braces, or placeholders that are not valid Python identifiers.

Template instances also provide one public data attribute:

template
This is the object passed to the constructor's template argument. In general, you shouldn't change it, but read-only access is not enforced.

Here is an example of how to use a Template:

>>> from string import Template
>>> s = Template('$who likes $what')
>>> s.substitute(who='tim', what='kung pao')
'tim likes kung pao'
>>> d = dict(who='tim')
>>> Template('Give $who $100').substitute(d)
Traceback (most recent call last):
...
ValueError: Invalid placeholder in string: line 1, col 11
>>> Template('$who likes $what').substitute(d)
Traceback (most recent call last):
...
KeyError: 'what'
>>> Template('$who likes $what').safe_substitute(d)
'tim likes $what'

Advanced usage: you can derive subclasses of Template to customize the placeholder syntax, delimiter character, or the entire regular expression used to parse template strings. To do this, you can override these class attributes:

  • delimiter – This is the literal string describing a placeholder introducing delimiter. The default value is $ . Note that this should not be a regular expression, as the implementation will call re.escape() on this string as needed.

  • idpattern – This is the regular expression describing the pattern for non-braced placeholders (the braces will be added automatically as appropriate). The default value is the regular expression [_a-z][_a-z0-9]* .

  • flags – The regular expression flags that will be applied when compiling the regular expression used for recognizing substitutions. The default value is re.IGNORECASE . Note that re.VERBOSE will always be added to the flags, so custom idpattern s must follow conventions for verbose regular expressions.

    New in version 3.2.

Alternatively, you can provide the entire regular expression pattern by overriding the class attribute pattern . If you do this, the value must be a regular expression object with four named capturing groups. The capturing groups correspond to the rules given above, along with the invalid placeholder rule:

  • escaped – This group matches the escape sequence, e.g. $$ , in the default pattern.
  • named – This group matches the unbraced placeholder name; it should not include the delimiter in the capturing group.
  • braced – This group matches the brace enclosed placeholder name; it should not include either the delimiter or braces in the capturing group.
  • invalid – This group matches any other delimiter pattern (usually a single delimiter), and it should appear last in the regular expression.
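As a minimal sketch of this customization (the class name and delimiter choice here are ours, purely for illustration), overriding delimiter alone is often all that is needed:

```python
from string import Template

class PercentTemplate(Template):
    # Use '%' instead of '$' to introduce placeholders; '%%' becomes a literal '%'.
    delimiter = '%'

t = PercentTemplate('%who scored %%100 on the %subject test')
print(t.substitute(who='tim', subject='math'))  # tim scored %100 on the math test
```

Because the implementation calls re.escape() on delimiter, no regular-expression quoting is needed even for metacharacters like '%' or '&'.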
6.1.5. Helper functions
string.capwords(s, sep=None)
Split the argument into words using str.split() , capitalize each word using str.capitalize() , and join the capitalized words using str.join() . If the optional second argument sep is absent or None , runs of whitespace characters are replaced by a single space and leading and trailing whitespace are removed, otherwise sep is used to split and join the words.
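For example:

```python
import string

# Runs of whitespace collapse to single spaces when sep is None.
print(string.capwords('the  quick brown\tfox'))      # The Quick Brown Fox

# With an explicit sep, splitting and joining use it verbatim.
print(string.capwords('alpha-beta-gamma', sep='-'))  # Alpha-Beta-Gamma
```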

[Jun 21, 2019] The War on Normal People: The Truth About America's Disappearing Jobs and Why Universal Basic Income Is Our Future, by Andrew Yang

Looks like this guy somewhat understands the problems with neoliberalism, but is still captured by neoliberal ideology.
Notable quotes:
"... That all seems awfully quaint today. Pensions disappeared for private-sector employees years ago. Most community banks were gobbled up by one of the mega-banks in the 1990s -- today five banks control 50 percent of the commercial banking industry, which itself mushroomed to the point where finance enjoys about 25 percent of all corporate profits. Union membership fell by 50 percent. ..."
"... Ninety-four percent of the jobs created between 2005 and 2015 were temp or contractor jobs without benefits; people working multiple gigs to make ends meet is increasingly the norm. Real wages have been flat or even declining. The chances that an American born in 1990 will earn more than their parents are down to 50 percent; for Americans born in 1940 the same figure was 92 percent. ..."
"... Thanks to Milton Friedman, Jack Welch, and other corporate titans, the goals of large companies began to change in the 1970s and early 1980s. The notion they espoused -- that a company exists only to maximize its share price -- became gospel in business schools and boardrooms around the country. Companies were pushed to adopt shareholder value as their sole measuring stick. ..."
"... Simultaneously, the major banks grew and evolved as Depression-era regulations separating consumer lending and investment banking were abolished. Financial deregulation started under Ronald Reagan in 1980 and culminated in the Financial Services Modernization Act of 1999 under Bill Clinton that really set the banks loose. The securities industry grew 500 percent as a share of GDP between 1980 and the 2000s while ordinary bank deposits shrank from 70 percent to 50 percent. Financial products multiplied as even Main Street companies were driven to pursue financial engineering to manage their affairs. GE, my dad's old company and once a beacon of manufacturing, became the fifth biggest financial institution in the country by 2007. ..."
Apr 27, 2019 | www.amazon.com

The logic of the meritocracy is leading us to ruin, because we are collectively primed to ignore the voices of the millions getting pushed into economic distress by the grinding wheels of automation and innovation. We figure they're complaining or suffering because they're losers.

We need to break free of this logic of the marketplace before it's too late.

[Neoliberalism] had decimated the economies and cultures of these regions and were set to do the same to many others.

In response, American lives and families are falling apart. Rampant financial stress is the new normal. We are in the third or fourth inning of the greatest economic shift in the history of mankind, and no one seems to be talking about it or doing anything in response.

The Great Displacement didn't arrive overnight. It has been building for decades as the economy and labor market changed in response to improving technology, financialization, changing corporate norms, and globalization. In the 1970s, when my parents worked at GE and Blue Cross Blue Shield in upstate New York, their companies provided generous pensions and expected them to stay for decades. Community banks were boring businesses that lent money to local companies for a modest return. Over 20 percent of workers were unionized. Some economic problems existed -- growth was uneven and inflation periodically high. But income inequality was low, jobs provided benefits, and Main Street businesses were the drivers of the economy. There were only three television networks, and in my house we watched them on a TV with an antenna that we fiddled with to make the picture clearer.

That all seems awfully quaint today. Pensions disappeared for private-sector employees years ago. Most community banks were gobbled up by one of the mega-banks in the 1990s -- today five banks control 50 percent of the commercial banking industry, which itself mushroomed to the point where finance enjoys about 25 percent of all corporate profits. Union membership fell by 50 percent.

Ninety-four percent of the jobs created between 2005 and 2015 were temp or contractor jobs without benefits; people working multiple gigs to make ends meet is increasingly the norm. Real wages have been flat or even declining. The chances that an American born in 1990 will earn more than their parents are down to 50 percent; for Americans born in 1940 the same figure was 92 percent.

Thanks to Milton Friedman, Jack Welch, and other corporate titans, the goals of large companies began to change in the 1970s and early 1980s. The notion they espoused -- that a company exists only to maximize its share price -- became gospel in business schools and boardrooms around the country. Companies were pushed to adopt shareholder value as their sole measuring stick.

Hostile takeovers, shareholder lawsuits, and later activist hedge funds served as prompts to ensure that managers were committed to profitability at all costs. On the flip side, CEOs were granted stock options for the first time that wedded their individual gain to the company's share price. The ratio of CEO to worker pay rose from 20 to 1 in 1965 to 271 to 1 in 2016. Benefits were streamlined and reduced and the relationship between company and employee weakened to become more transactional.

Simultaneously, the major banks grew and evolved as Depression-era regulations separating consumer lending and investment banking were abolished. Financial deregulation started under Ronald Reagan in 1980 and culminated in the Financial Services Modernization Act of 1999 under Bill Clinton that really set the banks loose. The securities industry grew 500 percent as a share of GDP between 1980 and the 2000s while ordinary bank deposits shrank from 70 percent to 50 percent. Financial products multiplied as even Main Street companies were driven to pursue financial engineering to manage their affairs. GE, my dad's old company and once a beacon of manufacturing, became the fifth biggest financial institution in the country by 2007.

Nolia Nessa , April 5, 2018

profound and urgent work of social criticism

It's hard to be in the year 2018 and not hear about the endless studies alarming the general public about coming labor automation. But what Yang provides in this book is two key things: automation has already been ravaging the country which has led to the great political polarization of today, and second, an actual vision into what happens when people lose jobs, and it definitely is a lightning strike of "oh crap"

I found this book relatively impressive and frightening. Yang, a former lawyer, entrepreneur, and non-profit leader, shows with inarguable data that when companies automate work and use new software, communities die, drug use increases, suicide increases, and crime skyrockets. The new jobs created go to big cities, the surviving talent leaves, and the remaining people lose hope and descend into madness. (as a student of psychology, this is not surprising)

He starts by painting the picture of the average American and how fragile they are economically. He deconstructs the labor predictions and how technology is going to ravage it. He discusses the future of work. He explains what has happened in technology and why it's suddenly a huge threat. He shows what this means: economic inequality rises, the people have less power, the voice of democracy is diminished, no one owns stocks, people get poorer etc. He shows that talent is leaving small towns, money is concentrating to big cities faster. He shows what happens when those other cities die (bad things), and then how the people react when they have no income (really bad things). He shows how retraining doesn't work and college is failing us. We don't invest in vocational skills, and our youth is underemployed pushed into freelance work making minimal pay. He shows how no one trusts the institutions anymore.

Then he discusses solutions with a focus on Universal Basic Income. I was a skeptic of the idea until I read this book. You literally walk away with this burning desire to prevent a Mad Max-esque civil war, and it's hard to argue with him. We don't have much time and our bloated micromanaged welfare programs cannot sustain.

[Jun 18, 2019] The Party's Over: Oil, War and the Fate of Industrial Societies, by Richard Heinberg

Jun 08, 2019 | www.amazon.com

The world is about to run out of cheap oil and change dramatically. Within the next few years, global production will peak. Thereafter, even if industrial societies begin to switch to alternative energy sources, they will have less net energy each year to do all the work essential to the survival of complex societies. We are entering a new era, as different from the industrial era as the latter was from medieval times.

In The Party's Over , Richard Heinberg places this momentous transition in historical context, showing how industrialism arose from the harnessing of fossil fuels, how competition to control access to oil shaped the geopolitics of the twentieth century and how contention for dwindling energy resources in the twenty-first century will lead to resource wars in the Middle East, Central Asia and South America. He describes the likely impacts of oil depletion and all of the energy alternatives. Predicting chaos unless the United States -- the world's foremost oil consumer -- is willing to join with other countries to implement a global program of resource conservation and sharing, he also recommends a "managed collapse" that might make way for a slower-paced, low-energy, sustainable society in the future.

More readable than other accounts of this issue, with fuller discussion of the context, social implications and recommendations for personal, community, national and global action, Heinberg's updated book is a riveting wake-up call for human-kind as the oil era winds down, and a critical tool for understanding and influencing current US foreign policy.

Richard Heinberg , from Santa Rosa, California, has been writing about energy resources issues and the dynamics of cultural change for many years. A member of the core faculty at New College of California, he is an award-winning author of three previous books. His Museletter was nominated for the Best Alternative Newsletter award by Utne in 1993.


Laura Lea Evans , April 20, 2013

love and hate

Well, how to describe something that is so drastic in predictions as to make one quiver? Heinberg spells out a future for humans that is not very optimistic but sadly, is more accurate than any of us would like. The information and research done by the author is first rate and irrefutable, which is as it should be. The news: dire. This is my first in a series of his work and indeed, it's a love/hate experience since there is a lot of hopelessness in the outcome of our current path. Be that as it may, this is a book to cherish and an author to admire.

Scott Forbes, May 31, 2005
This book will make you think differently about energy

Surprisingly, it's not about the rising cost of the energy that you personally use. It's about the whole economy that has been built on using a non-replenishable energy supply. You know how those economists always count on the 3% growth in the GDP. Well, the book argues that this long-term growth is fundamentally driven by our long-term growth in energy usage, which everyone knows will have to turn around at some point.

The other surprising fact is that the turning point comes long before you run out of oil. Heinberg shows data indicating that half of the oil is still left in the ground when the returns start to diminish. And it appears that we are within a few years of reaching that point.

So we've used up about half the "available" (i.e. feasible to extract from an energy perspective) oil. Now oil production starts to decrease. What happens next is anyone's guess, but Heinberg presents some detailed discussions of the possibilities. Don't assume that a coal, nuclear, or "hydrogen" economy is going to be as easy and profitable as the petroleum economy we are leaving behind.

I've read lots of books about energy and the environment, and this is definitely one of the best.

B. King, November 22, 2003
An Industrial Strength Critique of Energy Usage

Part history and part prophecy, this book is an outstanding summary of many major issues facing Western industrial society. Author Richard Heinberg provides a scholarly critique of modern industrialism, focusing on its current use of energy, and a sobering forecast based on predictable trends.

The key point of the book is that the Earth's crust can provide mankind with an essentially finite amount of fossil fuel energy, with primary reference to oil. Drawing on the relatively unknown, and oft-misunderstood, concept of "peak oil," the book addresses the imminent shortfall of petroleum that will not be available on world markets. That day of reckoning is far closer than most people think. "Peak oil" is a global application of geologist M. King Hubbert's (1903-1989) studies of oil production in "mature" exploration districts. That is, exploration for oil in sedimentary basins at first yields substantial discoveries, which are then produced. Additional exploration yields less and less "new" oil, with each increment of discovery coming at greater and greater effort. Eventually, absent additional significant discovery, production "peaks" and then commences an irreversible decline.

This has already occurred in the U.S. in the 1970s, and is in the process of occurring in oil-producing nations such as Mexico, Britain, Egypt, Indonesia and Malaysia. Ominously, "peak" production can be forecast in the next few years in such significant producing nations as Saudi Arabia and Iraq (in addition to all of the other problems in those unfortunate nations).

Much of the rise of industrial society was tied to increasing availability of high energy-density fuel, particularly oil. Western society, and its imitators in non-Western lands, is based upon access to large amounts of energy-dense fuel, and that fuel is oil. With respect to the U.S., the domestic decline in oil production has been made up, over the past thirty years, by increasing imports from other locales, with concomitant political risk. When the world production "peaks" in the next few years, the competition for energy sources will become more fierce than it already is. This book addresses issues related to what are commonly thought of as "substitutes" for oil, such as coal, natural gas and natural gas liquids, and shatters many myths. The author also delves deeply into energy sources such as "tar sand," "oil shale," nuclear and renewable sources. And thankfully, the author offers a number of proposals to address the looming problem (although these proposals are probably not what an awful lot of people want to hear.)

A book like this one could easily descend into a tawdry level of "chicken-little" squawks and utter tendentiousness. But thankfully it does not do so. This is a mature, well-reasoned and carefully footnoted effort. I could take issue with some of the author's points about "big business" and how decisions are made at high political levels, but not in this review. Instead I will simply congratulate Mr. Heinberg for writing an important contribution to social discourse. I hope that a lot of people read this book and start to look at and think about the world differently.

This Hippy Gorilla, July 19, 2006
Cogent, timely, largely ignored

Maybe the most important book since Charles Darwin's "The Origin of Species". This volume represents THE wakeup call for a world society quite literally addicted to crude oil for its continuation and, in most cases, its very survival.

Heinberg has done his homework, and this volume should be required reading for anyone in an industrialized nation, or one just getting started down that road. It is a proven scientific fact that within a few years, we will begin to run out of oil, and it will be pretty much gone within 5 or 6 decades. Considering that we have built our entire society around an oil economy, the implications are dire - far, far beyond not being able to drive through the coffee shop with the kids in your SUV on the way home from the mall. Alternative energy sources? Dream on - read on.

The book is thoroughly researched, well thought out and organized, and presents the often dissenting views on every side of this hugely important issue. It is also delightfully written and composed, and is fun and quick to read.

I highly recommend this book, and I hope at least one person reads what I'm writing and buys this book. And I hope they tell someone, too.

[Jun 14, 2019] The Twilight of Equality Neoliberalism, Cultural Politics, and the Attack on Democracy by Lisa Duggan

Notable quotes:
"... For example, she discusses neoliberal attempts to be "multicultural," but points out that economic resources are constantly redistributed upward. Neoliberal politics, she argues, has only reinforced and increased the divide between economic and social political issues. ..."
"... Because neoliberal politicians wish to save neoliberalism by reforming it, she argues that proposing alternate visions and ideas have been blocked. ..."
Jun 14, 2019 | www.amazon.com

S. Baker, November 27, 2007

5.0 out of 5 stars Summary/Review of Twilight of Equality

Duggan articulately connects social and economic issues to each other, arguing that neoliberal politics have divided the two when in actuality, they cannot be separated from one another.

In the introduction, Duggan argues that politics have become neoliberal - while politics operate under the guise of promoting social change or social stability, in reality, she argues, politicians have failed to make the connection between economic and social/cultural issues. She uses historical background to show that the claim that economic and social issues can be separated from each other is false.

For example, she discusses neoliberal attempts to be "multicultural," but points out that economic resources are constantly redistributed upward. Neoliberal politics, she argues, has only reinforced and increased the divide between economic and social political issues.

After the introduction, Duggan focuses on a specific topic in each chapter: downsizing democracy, the incredible shrinking public, equality, and love and money. In the first chapter (downsizing democracy), she argues that through violent imperial assertion in the Middle East, budget cuts in social services, and disillusionments in political divides, "capitalists could actually bring down capitalism" (p. 2).

Because neoliberal politicians wish to save neoliberalism by reforming it, she argues that proposing alternate visions and ideas have been blocked. Duggan provides historical background that helps the reader connect early nineteenth-century U.S. legislation (regarding voting rights and slavery) to perpetuated institutional prejudices.

[Jun 14, 2019] Mean Girl Ayn Rand and the Culture of Greed by Lisa Duggan

Notable quotes:
"... From the 1980s to 2008, neoliberal politics and policies succeeded in expanding inequality around the world. The political climate Ayn Rand celebrated—the reign of brutal capitalism—intensified. Though Ayn Rand’s popularity took off in the 1940s, her reputation took a dive during the 1960s and ’70s. Then after her death in 1982, during the neoliberal administrations of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom, her star rose once more. (See chapter 4 for a full discussion of the rise of neoliberalism.) ..."
"... During the global economic crisis of 2008 it seemed that the neoliberal order might collapse. It lived on, however, in zombie form as discredited political policies and financial practices were restored. ..."
"... We are in the midst of a major global, political, economic, social, and cultural transition — but we don’t yet know which way we’re headed. The incoherence of the Trump administration is symptomatic of the confusion as politicians and business elites jockey with the Breitbart alt-right forces while conservative evangelical Christians pull strings. The unifying threads are meanness and greed, and the spirit of the whole hodgepodge is Ayn Rand. ..."
"... The current Trump administration is stuffed to the gills with Rand acolytes. Trump himself identifies with Fountainhead character Howard Roark; former secretary of state Rex Tillerson listed Atlas Shrugged as his favorite book in a Scouting magazine feature; his replacement Mike Pompeo has been inspired by Rand since his youth. Ayn Rand’s influence is ascendant across broad swaths of our dominant political culture — including among public figures who see her as a key to the Zeitgeist, without having read a word of her writing. ..."
"... Rand biographer Jennifer Burns asserts simply that Ayn Rand's fiction is “the gateway drug” to right-wing politics in the United States — although her influence extends well beyond the right wing ..."
"... The resulting Randian sense of life might be called “optimistic cruelty.” Optimistic cruelty is the sense of life for the age of greed. ..."
"... The Fountainhead and especially Atlas Shrugged fabricate history and romanticize violence and domination in ways that reflect, reshape, and reproduce narratives of European superiority and American virtue. ..."
"... It is not an accident that the novels’ fans, though gender mixed, are overwhelmingly white Americans of the professional, managerial, creative, and business classes." ..."
"... Does the pervasive cruelty of today's ruling classes shock you? Or, at least give you pause from time to time? Are you surprised by the fact that our elected leaders seem to despise people who struggle, people whose lives are not cushioned and shaped by inherited wealth, people who must work hard at many jobs in order to scrape by? If these or any of a number of other questions about the social proclivities of our contemporary ruling class detain you for just two seconds, this is the book for you. ..."
"... As Duggan makes clear, Rand's influence is not just that she offered a programmatic for unregulated capitalism, but that she offered an emotional template for "optimistic cruelty" that has extended far beyond its libertarian confines. Mean Girl is a fun, worthwhile read! ..."
"... Her work circulated endlessly in those circles of the Goldwater-ite right. I have changed over many years, and my own life experiences have led me to reject the casual cruelty and vicious supremacist bent of Rand's beliefs. ..."
"... In fact, though her views are deeply-seated, Rand is, at heart, a confidence artist, appealing only to narrow self-interest at the expense of the well-being of whole societies. ..."
Jun 14, 2019 | www.amazon.com

From the Introduction

... ... ...

Mean Girls, which was based on interviews with high school girls conducted by Rosalind Wiseman for her 2002 book Queen Bees and Wannabes, reflects the emotional atmosphere of the age of the Plastics (as the most popular girls at the fictional North Shore High are called), as well as the era of Wall Street's Gordon Gekko, whose motto is “Greed is Good.”1 The culture of greed is the hallmark of the neoliberal era, the period beginning in the 1970s when the protections of the U.S. and European welfare states, and the autonomy of postcolonial states around the world, came under attack. Advocates of neoliberalism worked to reshape global capitalism by freeing transnational corporations from restrictive forms of state regulation, stripping away government efforts to redistribute wealth and provide public services, and emphasizing individual responsibility over social concern.

From the 1980s to 2008, neoliberal politics and policies succeeded in expanding inequality around the world. The political climate Ayn Rand celebrated—the reign of brutal capitalism—intensified. Though Ayn Rand’s popularity took off in the 1940s, her reputation took a dive during the 1960s and ’70s. Then after her death in 1982, during the neoliberal administrations of Ronald Reagan in the United States and Margaret Thatcher in the United Kingdom, her star rose once more. (See chapter 4 for a full discussion of the rise of neoliberalism.)

During the global economic crisis of 2008 it seemed that the neoliberal order might collapse. It lived on, however, in zombie form as discredited political policies and financial practices were restored. But neoliberal capitalism has always been contested, and competing and conflicting political ideas and organizations proliferated and intensified after 2008 as well.

Protest politics blossomed on the left with Occupy Wall Street, Black Lives Matter, and opposition to the Dakota Access oil pipeline at the Standing Rock Sioux reservation in the United States, and with the Arab Spring, and other mobilizations around the world. Anti-neoliberal electoral efforts, like the Bernie Sanders campaign for the U.S. presidency, generated excitement as well.

But protest and organizing also expanded on the political right, with reactionary populist, racial nationalist, and protofascist gains in such countries as India, the Philippines, Russia, Hungary, and the United States rapidly proliferating. Between these far-right formations on the one side and persistent zombie neoliberalism on the other, operating sometimes at odds and sometimes in cahoots, the Season of Mean is truly upon us.

We are in the midst of a major global, political, economic, social, and cultural transition — but we don’t yet know which way we’re headed. The incoherence of the Trump administration is symptomatic of the confusion as politicians and business elites jockey with the Breitbart alt-right forces while conservative evangelical Christians pull strings. The unifying threads are meanness and greed, and the spirit of the whole hodgepodge is Ayn Rand.

Rand’s ideas are not the key to her influence. Her writing does support the corrosive capitalism at the heart of neoliberalism, though few movers and shakers actually read any of her nonfiction. Her two blockbuster novels, The Fountainhead and Atlas Shrugged, are at the heart of her incalculable impact. Many politicians and government officials going back decades have cited Rand as a formative influence—particularly finance guru and former Federal Reserve chairman Alan Greenspan, who was a member of Rand's inner circle, and Ronald Reagan, the U.S. president most identified with the national embrace of neoliberal policies.

Major figures in business and finance are or have been Rand fans: Jimmy Wales (Wikipedia), Peter Thiel (PayPal), Steve Jobs (Apple), John Mackey (Whole Foods), Mark Cuban (NBA), John Allison (BB&T Banking Corporation), Travis Kalanick (Uber), Jeff Bezos (Amazon), ad infinitum.

There are also large clusters of enthusiasts for Rand’s novels in the entertainment industry, from the 1940s to the present—from Barbara Stanwyck, Joan Crawford, and Raquel Welch to Jerry Lewis, Brad Pitt, Angelina Jolie, Rob Lowe, Jim Carrey, Sandra Bullock, Sharon Stone, Ashley Judd, Eva Mendes, and many more.

The current Trump administration is stuffed to the gills with Rand acolytes. Trump himself identifies with Fountainhead character Howard Roark; former secretary of state Rex Tillerson listed Atlas Shrugged as his favorite book in a Scouting magazine feature; his replacement Mike Pompeo has been inspired by Rand since his youth. Ayn Rand’s influence is ascendant across broad swaths of our dominant political culture — including among public figures who see her as a key to the Zeitgeist, without having read a word of her writing.

But beyond the famous or powerful fans, the novels have had a wide popular impact as bestsellers since publication. Along with Rand’s nonfiction, they form the core texts for a political/philosophical movement: Objectivism. There are several U.S.-based Objectivist organizations and innumerable clubs, reading groups, and social circles. A 1991 survey by the Library of Congress and the Book of the Month Club found that only the Bible had influenced readers more than Atlas Shrugged, while a 1998 Modern Library poll listed The Fountainhead and Atlas Shrugged as the two most revered novels in English.

Atlas Shrugged in particular skyrocketed in popularity in the wake of the 2008 financial crash. The U.S. Tea Party movement, founded in 2009, featured numerous Ayn Rand—based signs and slogans, especially the opening line of Atlas Shrugged: “Who is John Galt?” Republican pundit David Frum claimed that the Tea Party was reinventing the GOP as “the party of Ayn Rand.” During 2009 as well, sales of Atlas Shrugged tripled, and GQ magazine called Rand the year’s most influential author. A 2010 Zogby poll found that 29 percent of respondents had read Atlas Shrugged, and half of those readers said it had affected their political and ethical thinking.

In 2018, a business school teacher writing in Forbes magazine recommended repeat readings: “Recent events — the bizarro circus that is the 2016 election, the disintegration of Venezuela, and so on — make me wonder if a lot of this could have been avoided had we taken Atlas Shrugged's message to heart. It is a book that is worth re-reading every few years.”3

Rand biographer Jennifer Burns asserts simply that Ayn Rand's fiction is “the gateway drug” to right-wing politics in the United States — although her influence extends well beyond the right wing.4

But how can the work of this one novelist (also an essayist, playwright, and philosopher), however influential, be a significant source of insight into the rise of a culture of greed? In a word: sex. Ayn Rand made acquisitive capitalists sexy. She launched thousands of teenage libidos into the world of reactionary politics on a wave of quivering excitement. This sexiness extends beyond romance to infuse the creative aspirations, inventiveness, and determination of her heroes with erotic energy, embedded in what Rand called her “sense of life.” Analogous to what Raymond Williams has called a “structure of feeling,” Rand’s sense of life combines the libido-infused desire for heroic individual achievement with contempt for social inferiors and indifference to their plight.5

Lauren Berlant has called the structure of feeling, or emotional situation, of those who struggle for a good life under neoliberal conditions “cruel optimism”—the complex of feelings necessary to keep plugging away hopefully despite setbacks and losses. Rand's contrasting sense of life applies to those whose fantasies of success and domination include no doubt or guilt. The feelings of aspiration and glee that enliven Rand’s novels combine with contempt for and indifference to others. The resulting Randian sense of life might be called “optimistic cruelty.” Optimistic cruelty is the sense of life for the age of greed.

Ayn Rand’s optimistic cruelty appeals broadly and deeply through its circulation of familiar narratives: the story of “civilizational” progress, the belief in American exceptionalism, and a commitment to capitalist freedom.

Her novels engage fantasies of European imperial domination conceived as technological and cultural advancement, rather than as violent conquest. America is imagined as a clean slate for pure capitalist freedom, with no indigenous people, no slaves, no exploited immigrants or workers in sight. The Fountainhead and especially Atlas Shrugged fabricate history and romanticize violence and domination in ways that reflect, reshape, and reproduce narratives of European superiority and American virtue.

Their logic also depends on a hierarchy of value based on racialized beauty and physical capacity — perceived ugliness or disability are equated with pronounced worthlessness and incompetence.

Through the forms of romance and melodrama, Rand novels extrapolate the story of racial capitalism as a story of righteous passion and noble virtue. They retell The Birth of a Nation through the lens of industrial capitalism (see chapter 2). They solicit positive identification with winners, with dominant historical forces. It is not an accident that the novels’ fans, though gender mixed, are overwhelmingly white Americans of the professional, managerial, creative, and business classes.


aslan, June 1, 2019

devastating account of the ethos that shapes contemporary America

Ayn Rand is a singular influence on American political thought, and this book brilliantly unfolds how Rand gave voice to the ethos that shapes contemporary conservatism. Duggan -- whose equally insightful earlier book Twilight of Equality offered an analysis of neoliberalism and showed how it is both a distortion and continuation of classical liberalism -- here extends the analysis of American market mania by showing how an anti-welfare state ethos took root as a "structure of feeling" in American culture, elevating the individual over the collective and promoting a culture of inequality as itself a moral virtue.

Although reviled by the right-wing press (she should wear this as a badge of honor), Duggan is the most astute guide one could hope for through this devastating history of our recent past, and the book helps explain how we ended up where we are, where far-right, racist nationalism colludes (paradoxically) with libertarianism, an ideology of extreme individualism and (unlikely bed fellows, one might have thought) Silicon Valley entrepreneurship.

This short, accessible book is essential reading for everyone who wants to understand the contemporary United States.

Wreck2, June 1, 2019
contemporary cruelty

Does the pervasive cruelty of today's ruling classes shock you? Or, at least give you pause from time to time? Are you surprised by the fact that our elected leaders seem to despise people who struggle, people whose lives are not cushioned and shaped by inherited wealth, people who must work hard at many jobs in order to scrape by? If these or any of a number of other questions about the social proclivities of our contemporary ruling class detain you for just two seconds, this is the book for you.

Writing with wit, rigor, and vigor, Lisa Duggan explains how Ayn Rand, the "mean girl," has captured the minds and snatched the bodies of so very many, and has rendered them immune to feelings of shared humanity with those whose fortunes are not as rosy as their own. An indispensable work, a short read that leaves a long memory.

kerwynk, June 2, 2019
Valuable and insightful commentary on Rand and Rand's influence on today's world

Mean Girl offers not only a biographical account of Rand (including the fact that she modeled one of her key heroes on a serial killer), but describes Rand's influence on neoliberal thinking more generally.

As Duggan makes clear, Rand's influence is not just that she offered a programmatic for unregulated capitalism, but that she offered an emotional template for "optimistic cruelty" that has extended far beyond its libertarian confines. Mean Girl is a fun, worthwhile read!

Sister, June 3, 2019

Superb political and cultural exploration of Rand's influence

Lisa Duggan's concise but substantive look at the political and cultural influence of Ayn Rand is stunning. I feel like I've been waiting most of a lifetime for a book that is as wonderfully readable as it is insightful. Many who write about Rand reduce her to a caricature hero or demon without taking her, and the history and choices that produced her seriously as a subject of cultural inquiry. I am one of those people who first encountered Rand's books - novels, but also some nonfiction and her play, "The Night of January 16th," in which audience members were selected as jurors – as a teenager.

Under the thrall of some right-wing locals, I was so drawn to Rand's larger-than-life themes, the crude polarization of "individualism" and "conformity," the admonition to selfishness as a moral virtue, her reductive dismissal of the public good as "collectivism."

Her work circulated endlessly in those circles of the Goldwater-ite right. I have changed over many years, and my own life experiences have led me to reject the casual cruelty and vicious supremacist bent of Rand's beliefs.

But over those many years, the coterie of Rand true believers has kept the faith and expanded. One of the things I value about Duggan's compelling account is her willingness to take seriously the far reach of Rand's indifference to human suffering even as she strips away the veneer that suggests Rand's beliefs were deep.

In fact, though her views are deeply-seated, Rand is, at heart, a confidence artist, appealing only to narrow self-interest at the expense of the well-being of whole societies.

I learned that the hard way, but I learned it. Now I am recommending Duggan's wise book to others who seek to understand today's cultural and political moment in the United States and the rise of an ethic of indifference to anybody but the already affluent. Duggan is comfortable with complexity; most Randian champions or detractors are not.

[Jun 11, 2019] How to Hide an Empire: A History of the Greater United States by Daniel Immerwahr

Notable quotes:
"... No other book out there has the level of breadth on the history of US imperialism that this work provides. Even though it packs 400 pages of text (which might seem like a turnoff for non-academic readers), "How to Hide an Empire" is highly readable given Immerwahr's skills as a writer. Also, its length is part of what makes it awesome because it gives it the right amount of detail and scope. ..."
"... Alleging that US imperialism in its long evolution (which this book deciphers with poignancy) has had no bearing on the destinies of its once conquered populations is as fallacious as saying that the US is to blame for every single thing that happens in Native American communities, or in the Philippines, Puerto Rico, Guam, American Samoa, etc. Not everything that happens in these locations and among these populations is directly connected to US expansionism, but a great deal is. ..."
"... This is exactly the kind of book that drives the "My country, right or wrong" crowd crazy. Yes, slavery and genocide and ghastly scientific experiments existed before Europeans colonized the Americas, but it's also fair and accurate to say that Europeans made those forms of destruction into a bloody artform. Nobody did mass slaughter better. ..."
Feb 19, 2019 | www.amazon.com
4.6 out of 5 stars, 50 customer reviews

Jose I. Fuste, February 25, 2019

5.0 out of 5 stars Comprehensive yet highly readable. A necessary and highly useful update.

I'm a professor at the University of California San Diego and I'm assigning this for a graduate class.

No other book out there has the level of breadth on the history of US imperialism that this work provides. Even though it packs 400 pages of text (which might seem like a turnoff for non-academic readers), "How to Hide an Empire" is highly readable given Immerwahr's skills as a writer. Also, its length is part of what makes it awesome because it gives it the right amount of detail and scope.

I could not disagree more with the person who gave this book one star. Take it from me: I've taught hundreds of college students who graduate among the best in their high school classes and they know close to nothing about the history of US settler colonialism, overseas imperialism, or US interventionism around the world. If you give University of California college students a quiz on where the US' overseas territories are, most who take it will fail (trust me, I've done it). And this is not their fault. Instead, it's a product of the US education system that fails to give students a nuanced and geographically comprehensive understanding of the oversized effect that their country has around our planet.

Alleging that US imperialism in its long evolution (which this book deciphers with poignancy) has had no bearing on the destinies of its once conquered populations is as fallacious as saying that the US is to blame for every single thing that happens in Native American communities, or in the Philippines, Puerto Rico, Guam, American Samoa, etc. Not everything that happens in these locations and among these populations is directly connected to US expansionism, but a great deal is.

A case in point is Puerto Rico's current fiscal and economic crisis. The island's political class share part of the blame for Puerto Rico's present rut. A lot of it is also due to unnatural (i.e. "natural" but human-exacerbated) disasters such as Hurricane María. However, there is no denying that the evolution of Puerto Rico's territorial status has generated a host of adverse economic conditions that US states (including an island state such as Hawaii) do not have to contend with. An association with the US has undoubtedly raised the floor of material conditions in these places, but it has also imposed an unjust glass ceiling that most people around the US either do not know about or continue to ignore.

To add to those unfair economic limitations, there are political injustices regarding the lack of representation in Congress, and in the case of Am. Samoa, their lack of US citizenship. The fact that the populations in the overseas territories can't make up their mind about what status they prefer is: a) understandable given the way they have been mistreated by the US government, and b) irrelevant because what really matters is what Congress decides to do with the US' far-flung colonies, and there is no indication that Congress wants to either fully annex them or let them go because neither would be convenient to the 50 states and the political parties that run them. Instead, the status quo of modern colonial indeterminacy is what works best for the most potent political and economic groups in the US mainland.

This book is about much more than that, though. It's also a history of how and why the United States got to control so much of what happens around the world without creating additional formal colonies like the "territories" that exist in this legal limbo. Part of its goal is to show precisely how US imperialism has been made more cost-effective and also more invisible.

Read Immerwahr's book, and don't listen to the apologists of US imperialism, which is still an active force that contradicts the US' professed values and needs to be actively dismantled. Their attempts at discrediting this important book reflect a denialism of the US' imperial realities that has endured throughout the history that this book summarizes.

"How to Hide an Empire: A History of the Greater United States" is a great starting point for making the US public aware of the US' contradictions as an "empire of liberty" (a phrase once used by Thomas Jefferson to describe the US as it expanded westward beyond the original 13 colonies). It is also a necessary update to other books on this topic that are already out there, and it is likely to hold the reader's attention more given its crafty narrative prose and structure.

David Robson, February 26, 2019
5.0 out of 5 stars: Why So Sensitive?

This is exactly the kind of book that drives the "My country, right or wrong" crowd crazy. Yes, slavery and genocide and ghastly scientific experiments existed before Europeans colonized the Americas, but it's also fair and accurate to say that Europeans made those forms of destruction into a bloody art form. Nobody did mass slaughter better.

The author of this compelling book reveals a history unknown to many readers, and does so with first-hand accounts and deep historical analyses. You might ask why we can't put such things behind us. The simple answer: we've never fully grappled with these events before in an honest and open way. This book does the nation a service by peering behind the curtain and facing the sobering truth of how we came to be what we are.

Thomas W. Moloney, April 9, 2019
5.0 out of 5 stars: This is a stunning book, not to be missed.

This is a stunning book, not to be missed. If you finished Sapiens with the feeling that your world view had greatly enlarged, you're likely to have the same experience with your view of the US from reading this engaging work. And like Sapiens, it's an entirely enjoyable read, full of delightful surprises and future dinner party gems.

The further you get into the book the more interesting and unexpected it becomes. You'll look at the US in ways you likely never considered before. This is not a 'political' book with an ax to grind or a single-party agenda. It's refreshingly insightful, beautifully written, fun to read.

This is a gift I'll give to many a good friend; I've just started with my wife. I rarely write reviews and have never met the author (now my only regret).

P, May 17, 2019
4.0 out of 5 stars: Content is A+. Never gets boring/tedious; never lingers; well written. It is perfect. 10/10

This book is an absolute powerhouse, a must-read, and should be a part of every student's curriculum in this God-forsaken country.

Strictly speaking, this brilliant read is focused on America's relationship with empire. But as with nearly everything American, one cannot discuss it without discussing race and injustice.

If you read this book, you will learn a lot of new things about subjects that you thought you knew everything about. You will have your eyes opened. You will be exposed to the dark underbelly of racism, corruption, greed and exploitation that undergird American ambition.

I don't know exactly what else to say other than to say you MUST READ THIS BOOK. This isn't a partisan statement -- it's not like Democrats are any better than Republicans in this book.

This is one of the best books I've ever read, and I am a voracious reader. The content is A+. It never gets boring. It never gets tedious. It never lingers on narratives. It's extremely well written. It is, in short, perfect. And as such, 10/10.

Sunny, May 11, 2019
5.0 out of 5 stars: Excellent and thoughtful discussion regarding the state of our union

I heard an interview with Daniel Immerwahr on NPR news / WDET radio regarding this book.

I am quite conservative and only listen to NPR news when it doesn't lean too far to the left.

However, the interview piqued my interest. I am so glad I purchased this ebook. What a phenomenal and informative read!!! WOW!! It's a "I never knew that" kind of read. Certainly not anything I was taught in school. This is thoughtful, well written and an easy read. Highly recommend!!

[Jun 11, 2019] Globalists: The End of Empire and the Birth of Neoliberalism by Quinn Slobodian

The author, in a very fuzzy way, comes to the idea that neoliberalism is in essence a Trotskyism for the rich, and that neoliberals want to use a strong state to enforce from above the type of markets they want. That includes free movement of capital, goods, and people across national borders. All this talk about "small government" is just a smokescreen for naive fools.
As in the 1930s, contemporary right-wing populism in Germany and Austria emerged from within neoliberalism, not in opposition to it. It essentially converts neoliberalism into "national liberalism": yes to free trade, but only on a bilateral basis with strict control of trade deficits; no to free migration and multilateralism.
Notable quotes:
"... The second explanation was that neoliberal globalization made a small number of people very rich, and it was in the interest of those people to promote a self-serving ideology using their substantial means by funding think tanks and academic departments, lobbying congress, fighting what the Heritage Foundation calls "the war of ideas." Neoliberalism, then, was a restoration of class power after the odd, anomalous interval of the mid-century welfare state. ..."
"... Neoliberal globalism can be thought of in its own terms as a negative theology, contending that the world economy is sublime and ineffable with a small number of people having special insight and ability to craft institutions that will, as I put it, encase the sublime world economy. ..."
"... One of the big goals of my book is to show neoliberalism is one form of regulation among many rather than the big Other of regulation as such. ..."
"... I build here on the work of other historians and show how the demands in the United Nations by African, Asian, and Latin American nations for things like the Permanent Sovereignty over Natural Resources, i.e. the right to nationalize foreign-owned companies, often dismissed as merely rhetorical, were actually existentially frightening to global businesspeople. ..."
"... They drafted neoliberal intellectuals to do things like craft agreements that gave foreign corporations more rights than domestic actors and tried to figure out how to lock in what I call the "human right of capital flight" into binding international codes. I show how we can see the development of the WTO as largely a response to the fear of a planned -- and equal -- planet that many saw in the aspirations of the decolonizing world. ..."
"... The neoliberal insight of the 1930s was that the market would not take care of itself: what Wilhelm Röpke called a market police was an ongoing need in a world where people, whether out of atavistic drives or admirable humanitarian motives, kept trying to make the earth a more equal and just place. ..."
"... The culmination of these processes by the 1990s is a world economy that is less like a laissez-faire marketplace and more like a fortress, as ever more of the world's resources and ideas are regulated through transnational legal instruments. ..."
Mar 16, 2018 | www.amazon.com

Hardcover: 400 pages
Publisher: Harvard University Press (March 16, 2018)
Language: English
ISBN-10: 0674979524
ISBN-13: 978-0674979529

From introduction

...The second explanation was that neoliberal globalization made a small number of people very rich, and it was in the interest of those people to promote a self-serving ideology using their substantial means by funding think tanks and academic departments, lobbying congress, fighting what the Heritage Foundation calls "the war of ideas." Neoliberalism, then, was a restoration of class power after the odd, anomalous interval of the mid-century welfare state.

There is truth to both of these explanations. Both presuppose a kind of materialist explanation of history with which I have no problem. In my book, though, I take another approach. What I found is that we could not understand the inner logic of something like the WTO without considering the whole history of the twentieth century. What I also discovered is that some of the members of the neoliberal movement from the 1930s onward, including Friedrich Hayek and Ludwig von Mises, did not use either of the explanations I just mentioned. They actually didn't say that economic growth excuses everything. One of the peculiar things about Hayek, in particular, is that he didn't believe in using aggregates like GDP -- the very measurements that we need to even say what growth is.

What I found is that neoliberalism as a philosophy is less a doctrine of economics than a doctrine of ordering -- of creating the institutions that provide for the reproduction of the totality [of financial elite control of the state]. At the core of the strain I describe is not the idea that we can quantify, count, price, buy and sell every last aspect of human existence. Actually, here it gets quite mystical. The Austrian and German School of neoliberals in particular believe in a kind of invisible world economy that cannot be captured in numbers and figures but always escapes human comprehension.

After all, if you can see something, you can plan it. Because of the very limits to our knowledge, we have to default to ironclad rules and not try to pursue something as radical as social justice, redistribution, or collective transformation. In a globalized world, we must give ourselves over to the forces of the market, or the whole thing will stop working.

So this is quite a different version of neoliberal thought than the one we usually have, premised on the abstract of individual liberty or the freedom to choose. Here one is free to choose but only within a limited range of options left after responding to the global forces of the market.

One of the core arguments of my book is that we can only understand the internal coherence of neoliberalism if we see it as a doctrine as concerned with the whole as the individual. Neoliberal globalism can be thought of in its own terms as a negative theology, contending that the world economy is sublime and ineffable with a small number of people having special insight and ability to craft institutions that will, as I put it, encase the sublime world economy.

To me, the metaphor of encasement makes much more sense than the usual idea of markets set free, liberated or unfettered. How can it be that in an era of proliferating third party arbitration courts, international investment law, trade treaties and regulation that we talk about "unfettered markets"? One of the big goals of my book is to show neoliberalism is one form of regulation among many rather than the big Other of regulation as such.

What I explore in Globalists is how we can think of the WTO as the latest in a long series of institutional fixes proposed for the problem of emergent nationalism and what neoliberals see as the confusion between sovereignty -- ruling a country -- and ownership -- owning the property within it.

I build here on the work of other historians and show how the demands in the United Nations by African, Asian, and Latin American nations for things like the Permanent Sovereignty over Natural Resources, i.e. the right to nationalize foreign-owned companies, often dismissed as merely rhetorical, were actually existentially frightening to global businesspeople.

They drafted neoliberal intellectuals to do things like craft agreements that gave foreign corporations more rights than domestic actors and tried to figure out how to lock in what I call the "human right of capital flight" into binding international codes. I show how we can see the development of the WTO as largely a response to the fear of a planned -- and equal -- planet that many saw in the aspirations of the decolonizing world.

Perhaps the lasting image of globalization that the book leaves is that world capitalism has produced a doubled world -- a world of imperium (the world of states) and a world of dominium (the world of property). The best way to understand neoliberal globalism as a project is that it sees its task as the never-ending maintenance of this division. The neoliberal insight of the 1930s was that the market would not take care of itself: what Wilhelm Röpke called a market police was an ongoing need in a world where people, whether out of atavistic drives or admirable humanitarian motives, kept trying to make the earth a more equal and just place.

The culmination of these processes by the 1990s is a world economy that is less like a laissez-faire marketplace and more like a fortress, as ever more of the world's resources and ideas are regulated through transnational legal instruments. The book acts as a kind of field guide to these institutions and, in the process, hopefully recasts the 20th century that produced them.


Mark bennett

3.0 out of 5 stars: One half of a decent book (May 14, 2018, Verified Purchase)

This is a rather interesting look at the political and economic ideas of a circle of important economists, including Hayek and von Mises, over the course of the last century. He shows rather convincingly that conventional narratives concerning their ideas are wrong: that they didn't believe in a weak state, didn't believe in laissez-faire capitalism or the power of the market, and that they saw mass democracy as a threat to vested economic interests.

The core belief of these people was in a world where money, labor, and products could flow across borders without any limit. Their vision was to remove these subjects (tariffs, immigration, and controls on the movement of money) from the control of the democracy-based nation-state and instead vest them in international organizations, which were by their nature undemocratic and beyond the influence of democracy. Rather than rejecting government power as such, what they rejected was national government power. They wanted weak national governments but, at the same time, strong undemocratic international organizations that would gain the powers taken from the state.

The other thing that characterized many of these people was a rather general rejection of economics. While some of them were (at least in theory) economists, they rejected the basic ideas of economic analysis and economic policy. The economy, to them, was a mystical thing beyond any human understanding or ability to influence in a positive way. Their only real belief was in "bigness": the larger the market for labor and goods, the more economically prosperous everyone would become, with an unregulated "global" market, featuring specialization across borders and free migration of labor, being the ultimate system.

The author shows how, over a period extending from the 1920s to the 1990s, these ideas evolved from marginal academic ideas to being dominant ideas internationally. Ideas that are reflected today in the structure of the European Union, the WTO (World Trade Organization) and the policies of most national governments. These ideas, which the author calls "neoliberalism", have today become almost assumptions beyond challenge. And even more strangely, the dominating ideas of the political left in most of the west.

The author makes the point, though in a weak way, that the "fathers" of neoliberalism saw themselves as "restoring" a lost golden age. That golden age being (roughly) the age of the original industrial revolution (the second half of the 1800s). And to the extent that they have been successful they have done that. But at the same time, they have brought back all the political and economic questions of that era as well.

In reading it, I started to wonder about the differences between modern neoliberalism and the liberal political movement during the industrial revolution. I really began to wonder about the actual motives of "reform" liberals in that era. Were they genuinely interested in reforms, or were the reforms just cynical politics designed to enhance business power at the expense of other vested interests? Was, in particular, the liberal interest in political reform and franchise expansion a genuine move toward political democracy, or simply a temporary ploy to increase their political power? If one assumes that the true principles of classic liberalism were always free trade, free migration of labor, and removing the power of governments to impact business, perhaps its collapse around the time of the First World War is easier to understand.

He also makes a good point about the EEC and the organizations that came before the EU. Those organizations were as much about protecting trade between Europe and former European colonial possessions as they were anything to do with trade within Europe.

To me at least, the analysis of the author was rather original. In particular, he did an excellent job of showing how the ideas of Hayek and von Mises have been distorted and misunderstood in the mainstream. He was able to show what their ideas were and how they relate to contemporary problems of government and democracy.

But there are some strong negatives in the book. The author offers up a complete virtue-signaling chapter to prove that the neoliberals are racists. He brings up things, like the John Birch Society, that have nothing to do with the book. He unleashes a whole lot of venom directed at American conservatives and Republicans, mostly set against a 1960s backdrop. He does all this for a bad purpose: to claim that the Kennedy administration was somehow a continuation of the New Deal rather than a step toward neoliberalism. His blindness and modern political partisanship, extended backward into history, do substantial damage to the argument of the book. He also spends an inordinate amount of time on the political issues of South Africa, which likewise adds nothing to the argument. His whole chapter on racism is an elaborate strawman held together by Röpke. He also spends a large amount of time grinding some sort of axe with regard to National Review and William F. Buckley.

He keeps resorting to the simple formula of finding something racist said or written by Röpke, and then inferring that anyone who quoted or had anything to do with Röpke shared his ideas and was also a racist. The whole point of the exercise seems to be to avoid any analysis of how the Democratic party (and the political left) drifted over the decades from the politics of the New Deal to neoliberal Clintonism.

Then, after that, he diverts further off the path by spending many pages on the greatness of the "global south", the G77, and the New International Economic Order (NIEO) promoted by the UN in the 1970s. And whatever the many faults of neoliberalism, Quinn Slobodian ends up standing for a worse set of ideas: international price controls, economic "reparations", nationalization, international trade subsidies, and a five-year plan for the world (socialist-style economic planning at a global level). In attaching himself to these particular ideas, he kills his own book. The premise of the book and his argument were very strong at first, but by around p. 220 it has become a throwback political tract in favor of the garbage economic and political ideas of the so-called third world circa 1974, complete with 70s-style extensive quotations from "Senegalese jurists".

Once the political agenda comes out, he just can't help himself. He opens the conclusion of the book by taking another cheap shot, for no clear reason, at William F. Buckley. He spends a lot of time on the Seattle anti-WTO protests from the 1990s. But he has NOTHING to say about Bill Clinton or Tony Blair or EU expansion or Obama or even the 2008 economic crisis, for that matter. Inexplicably for a book written in 2018, the content seems to end in the year 2000.

I'm giving it three stars for the first 150 pages, which were decent work. The second half rates zero stars. It could have been far better if he had written his history of neoliberalism in the context of the counter-narrative of Keynesian economics and its decline. It would have been better yet if the author had had the courage to talk about the transformation of the parties of the left and their complicity in the rise of neoliberalism. The author also tends to waste lots of pages repeating himself or, worse, telling you what he is going to say next. One would have expected a better standard of editing from Harvard University Press.

Jesper Doepping
5.0 out of 5 stars: A concise definition of neoliberalism and its historical influence (November 14, 2018, Verified Purchase)

Anybody interested in global trade, business, human rights, or democracy today should read this book.

The book follows the Austrians from their beginnings in the Habsburg empire to the start of the rebellion against the WTO. Most importantly, however, it follows the thinking behind the building of a global empire of capitalism with free trade, capital, and rights, all the way to the new "human right" to trade. It narrows down what neoliberal thought really consists of and indirectly differentiates it from the neoclassical economic tradition.

What I found most interesting is the turn from economics to law, and the conceptual distinctions between genes, tradition, and reason, which are translated into a quest for a rational and reason-based protection of dominium (the rule of property) against the overreach of imperium (the rule of states/people). This distinction speaks directly to the issues that the EU is currently facing.

Jackal
3.0 out of 5 stars: A historian with an agenda (October 22, 2018)

The author covers Mises, Hayek, and Machlup in Vienna. How to produce order once the Habsburg empire had been broken up after 1918? They pioneered data gathering about the economy. However, such data came to be used by the left as well. This forced the people mentioned to become intellectual thinkers as opposed to something else(??). I like how the author situates these people in a specific era, but he is reading history backwards. The book moves on, but stays in Central Europe. Ordocapitalism followed after Hitler. It was a German attempt to have both a strong state and a strong market, which, given Europe's fragmentation, required international treaties. This was seen as a way to avoid another Hitler. Later, international organisations like the IMF and WTO became the new institutions that embedded the global markets. The book ends in the 90s. So, in reading history backwards, the author finds quotations from Mises and Hayek that "prove" they were aiming to create intellectual cover for the global financial elite of the 2010s.

Nevertheless, the book is interesting if you like the history of ideas. He frames the questions intelligently in the historical context of the time. However, there is a huge question mark over its objectivity. The book is full of lefty dog whistles: the war-making state, regulation of capitalism, reproducing the power of elites, the problem [singular] of capitalism. In a podcast the author states point blank, "I wanted the left to see what the enemy was up to." I find it pathetic that authors are so blatantly partisan. How can we know whether he is objective when he doesn't even try? He dismissively claims that the neoliberal thinkers gave cover to what has become the globalist world order. So why should we not consider the current book as intellectual cover for some "new left" that is about to materialise? Maybe the book is just intellectual cover for the globalist elite being educated in left-wing private colleges.

[Jun 11, 2019] American Exceptionalism and American Innocence A People's History of Fake News_From the Revolutionary War to the War on Terror

Jun 11, 2019 | www.amazon.com

Did the U.S. really "save the world" in World War II? Should black athletes stop protesting and show more gratitude for what America has done for them? Are wars fought to spread freedom and democracy? Or is this all fake news?

American Exceptionalism and American Innocence examines the stories we're told that lead us to think that the U.S. is a force for good in the world, regardless of slavery, the genocide of indigenous people, and the more than a century's worth of imperialist war that the U.S. has wrought on the planet.

Sirvent and Haiphong detail just what Captain America's shield tells us about the pretensions of U.S. foreign policy, how Angelina Jolie and Bill Gates engage in humanitarian imperialism, and why the Broadway musical Hamilton is a monument to white supremacy.

====

Like a thunderbolt that penetrates the dark fog of ideological confusion, American Exceptionalism and American Innocence: A People's History of Fake News -- From the Revolutionary War to the War on Terror, illuminates the hidden spaces of the official story of the territory that came to be known as the United States of America.

Meticulously researched, American Exceptionalism and American Innocence utilizes a de-colonial lens that debunks the distorted, mythological liberal framework that rationalized the U.S. settler-colonial project. The de-colonized frame allows them to critically root their analysis in the psychosocial history, culture, political economy, and evolving institutions of the United States of America without falling prey to the unrecognized and unacknowledged liberalism and national chauvinism that seeps through so much of what is advanced as radical analysis today.

That is what makes this work so "exceptional" and so valuable at this moment of institutional and ideological crisis in the U.S. This crisis is indeed more severe and potentially more transformative than at any other moment in this nation's history.

With unflinching clarity, Sirvent and Haiphong go right to the heart of the current social, political, economic, and ideological crisis. They strip away the obscurantist nonsense pushed by liberal and state propagandists that the Trump phenomenon represents a fundamental departure from traditional "American values" by demonstrating that "Trumpism" is no departure at all, but only the unfiltered contemporary and particular expression of the core values that the nation was "founded" on.

What Sirvent and Haiphong expose in their work is that American exceptionalism and its corollary American innocence are the interconnected frames that not only explain why the crude white nationalism of a Donald Trump is consistent with the violence and white supremacy of the American experience, but also why that violence has been largely supported by large sections of the U.S. population repeatedly.

As the exceptional nation, the indispensable nation (the term President Obama liked to evoke to give humanitarian cover to the multiple interventions, destabilization campaigns, and unilateral global policing operations on behalf of U.S. and international capital), it is expected and largely accepted by the citizens of the U.S. that their nation-state has a right and, actually, a moral duty to do whatever it deems appropriate to uphold the international order. It can do that because this cause is noble and righteous. Lest we forget the words of Theodore Roosevelt, considered a great architect of American progressiveness: "If given the choice between righteousness and peace, I choose righteousness."

In a succinct and penetrating observation, Sirvent and Haiphong point out:

American exceptionalism has always presumed national innocence despite imposing centuries of war and plunder. The American nation-state has been at war for over ninety percent of its existence. These wars have all been justified as necessary ventures meant to defend or expand America's so-called founding values and beliefs. A consequence of centuries of endless war has been the historical tendency of the U.S. to erase from consciousness the realities that surround American domestic and international policy, not to mention the system of imperialism that governs both.

But the acceptance of state violence in the form of economic sanctions and direct and indirect military interventions is not the only consequence of the cultural conditioning process informed by the arrogance of white privilege, white rights, and the protection of white Western civilization. The racist xenophobia, impunity for killer-cops, mass incarceration, ICE raids and checkpoints, left-right ideological convergence to erase "blackness," are all part of the racial management process that still enjoys majoritarian support in the U.S.

American Exceptionalism and American Innocence 's focus on the insidious and corrosive impact of white supremacy throughout the book is a necessary and valuable corrective to the growing tendency toward marginalizing the issue of race, even among left forces under the guise of being opposed to so-called identity politics.

Centering the role of white supremacist ideologies and their connection to American exceptionalism and innocence, Sirvent and Haiphong argue that "communities and activists will be better positioned to dismantle them." American exceptionalism and notions of U.S. innocence not only provide ideological rationalizations for colonialism, capitalism, empire, and white supremacy, but also a normalized theoretical framework for how the world is and should be structured that inevitably makes criminals out of the people opposing U.S. dominance, within the nation and abroad.

Paul Krugman, a leading liberal within the context of the U.S. articulates this normalized framework that is shared across the ideological spectrum from liberal to conservative and even among some left forces. I have previously referred to this view of the world as representative of the psychopathology of white supremacy:

"We emerged from World War II with a level of both economic and military dominance not seen since the heyday of ancient Rome. But our role in the world was always about more than money and guns. It was also about ideals: America stood for something larger than itself -- for freedom, human rights and the rule of law as universal principles . . . By the end of World War II, we and our British allies had in effect conquered a large part of the world. We could have become permanent occupiers, and/or installed subservient puppet governments, the way the Soviet Union did in Eastern Europe. And yes, we did do that in some developing countries; our history with, say, Iran is not at all pretty. But what we mainly did instead was help defeated enemies get back on their feet, establishing democratic regimes that shared our core values and became allies in protecting those values. The Pax Americana was a sort of empire; certainly America was for a long time very much first among equals. But it was by historical standards a remarkably benign empire, held together by soft power and respect rather than force." 1

American Exceptionalism and American Innocence refutes this pathological view of the U.S. and demonstrates that this view is a luxury that the colonized peoples of the world cannot afford.

The bullet and the bomb -- the American military occupation and the police occupation -- are the bonds that link the condition of Black Americans to oppressed nations around the world. This is the urgency with which the authors approached their task. The physical and ideological war being waged against the victims of the colonial/capitalist white supremacist patriarchy is resulting in real suffering. Authentic solidarity with the oppressed requires a rejection of obfuscation. The state intends to secure itself and the ruling elite by legal or illegal means, by manipulating or completely jettisoning human freedom and democratic rights. Sirvent and Haiphong know that time is running out. They demonstrate the intricate collaboration between the state and the corporate and financial elite to create the conditions in which ideological and political opposition would be rendered criminal as the state grapples with the legitimacy crisis it finds itself in. They know that Trump's "make America great again" is the Republican version of Obama's heralding of U.S. exceptionalism, and that both are laying the ideological foundation for a cross-class white neofascist solution to the crisis of neoliberal capitalism.

The U.S. is well on its way toward a new form of totalitarianism that is more widespread than the forms of neofascist rule that were the norm in the Southern states of the U.S. from 1878 to 1965. Chris Hedges refers to it as "corporate totalitarianism." And unlike the sheer social terror experienced by the African American population as a result of the corporatist alignment of the new Democratic party and national and regional capital in the South, this "new" form of totalitarianism is more benign but perhaps even more insidious, because its control rests on the ability to control thought. And here lies the challenge. Marxist thinker Fredric Jameson shares a very simple lesson: "The lesson is this, and it is a lesson about system: one cannot change anything without changing everything." This simple theory of system change argues that when you change one part of a system you by necessity must change all parts of the system, because all parts are interrelated.

The failure of the Western left in general and the U.S. left in particular to understand the inextricable, structural connection between empire, colonization, capitalism, and white supremacy -- and that all elements of that oppressive structure must be confronted, dismantled, and defeated -- continues to give lifeblood to a system that is ready to be swept into the dustbin of history. This is why American Exceptionalism and American Innocence is nothing less than an act of subversion. It destabilizes the hegemonic assumptions and imposed conceptual frameworks of bourgeois liberalism and points the reader toward the inevitable conclusion that U.S. society in its present form poses an existential threat to global humanity.

Challenging the reader to rethink the history of the U.S. and to imagine a future, decolonial nation in whatever form it might take, Sirvent and Haiphong include a quote from Indigenous rights supporter Andrea Smith that captures both the subversive and optimistic character of their book. Smith is quoted saying:

Rather than a pursuit of life, liberty, and happiness that depends on the deaths of others . . . we can imagine new forms of governance based on the principles of mutuality, interdependence, and equality. When we do not presume that the United States should or will continue to exist, we can begin to imagine more than a kinder, gentler settler state founded on genocide and slavery.

American Exceptionalism and American Innocence gives us a weapon to reimagine a transformed U.S. nation, but it also surfaces the ideological minefields that we must avoid if we are to realize a new possibility and a new people.

John, May 26, 2019

Great Reading, But Some Omissions

I thought the book was great. However, key events were not discussed. One of the first American expeditionary forces deployed to bless the world was sent to establish treaty ports in China. These new American foreign beachheads in the Middle Kingdom came about as a result of Western ambitions to take the country over as new colonial owners, and led to one of the most ruinous periods in world history. Europe and the U.S. saturated the country with opium, leaving many Chinese stoned. This resulted in the destabilization of China, invasion by the brutal Japanese and the rise of Mao. The result: millions upon millions of people died because of American exceptionalism. It has taken China the last thirty years to recover from the disasters. Naturally, Trump & Co are not aware of this history or are unconcerned. However, the Chinese have not forgotten and routinely warn Team Trump they will not be bullied by foreigners again. Washington elites are ignorant, at the peril of everyone who wants peace. Footnote: American exceptionalists Roosevelt, Kerry, Forbes, etc., got their wealth the old-fashioned way -- by becoming drug kingpins to China.

The other big omission was World War I and especially its aftermath. Lauded by the French for helping save European imperialism, returning African-American soldiers found themselves singled out for extra-harsh Jim Crow treatment -- they were too uppity and refused to follow old social norms. Several Black vets were tortured and hanged while in uniform because they were bringing back a new message from the European trenches: equal treatment. They were also exemplary in defending other Black citizens from White mob ambushes.

Had the authors covered the WWI aftermath, they would have also had to critique the media in greater detail. They would have had to expose that the media was never a friend to African-Americans, which holds true to this day. The media was and is consistent in aligning with white elite interests. When Blacks rose up against bad treatment, the media always presented the white point of view. In fact, every white institution was engaged in this biased practice.

The Espionage Act also put a chill on labor organizing post-WWI. Indeed, elites were quick to call any Black unrest seditious and labelled some leaders, such as W.E.B. Du Bois, Bolshevik-inspired agitators who should be brought up on charges. This was the beginning of the linking of Black activism to the Kremlin, long before McCarthyism, COINTELPRO and "Black Identity Extremist" government labels.

[Jun 05, 2019] End of Discussion: How the Left's Outrage Industry Shuts Down Debate, Manipulates Voters, and Makes America Less Free (and Fun)

Notable quotes:
"... This book covers our current inability to allow all voices to be heard. Key words like "racism" and "?-phobia" (add your preference) can and do end conversations before they begin ..."
"... Hate speech is now any speech about an idea that you disagree with. As we go down the road of drowning out some speech, eventually no speech will be allowed. Finger pointers should think about the future, the future when they will be silenced. It's never wrong to listen to a different point of view. That's called learning. ..."
"... A very clear and balanced portrait of the current political landscape where a "minority of one" can be supposedly damaged as a result of being exposed to "offensive" ideas. ..."
"... A well documented journey of the transformation from a time when people had vehement arguments into Orwell-Land where the damage one supposedly "suffers" simply from having to "hear" offensive words, allows this shrieking minority to not only silence those voices, but to destroy the lives of the people who have the gall to utter them. ..."
Aug 01, 2017 | www.amazon.com

Q Garcia , August 9, 2017

1984 is Here - Everybody's Brother is Watching

This book covers our current inability to allow all voices to be heard. Key words like "racism" and "?-phobia" (add your preference) can and do end conversations before they begin.

Hate speech is now any speech about an idea that you disagree with. As we go down the road of drowning out some speech, eventually no speech will be allowed. Finger pointers should think about the future, the future when they will be silenced. It's never wrong to listen to a different point of view. That's called learning.

Brumble Buffin , August 18, 2015
Tolerance gone astray

I became interested in this book after watching Megyn Kelly's interview with Benson (Google it), where he gave his thoughts on the SCOTUS decision to legalize same-sex marriage in all 50 states. He made a heartfelt and reasoned plea for tolerance and grace on BOTH sides. He hit it out of the park with this and set himself apart from some of his gay peers who are determined that tolerance is NOT a two-way street.

We are seeing a vindictive campaign of lawsuits and intimidation against Christian business people who choose not to provide flowers and cakes for same-sex weddings. The First Amendment says that Congress shall make no law prohibiting the free exercise of religion. Thumbing your nose at this core American freedom should alarm us all. Personally, I'm for traditional marriage and I think the better solution would be to give civil unions the same legal rights and obligations as marriage, but that's another discussion.

So what about the book? It exceeded my expectations. Ham and Benson are smart and articulate. Their ideas are clearly presented, supported by hard evidence, and they are fair and balanced. The book is a pleasure to read -- unless you are a die-hard Lefty. In that case, it may anger you, but anger can be the first step to enlightenment.

Steve Bicker , August 1, 2015
A Well Documented Death of Debate

A very clear and balanced portrait of the current political landscape where a "minority of one" can be supposedly damaged as a result of being exposed to "offensive" ideas.

A well documented journey of the transformation from a time when people had vehement arguments into Orwell-Land where the damage one supposedly "suffers" simply from having to "hear" offensive words, allows this shrieking minority to not only silence those voices, but to destroy the lives of the people who have the gall to utter them.

The Left lays claim to being the "party of tolerance", unless you happen to "think outside THEIR box", which, to the Left is INtolerable and must not only be silenced, but exterminated... A great book!

[May 22, 2019] XINYUNG Fitness Tracker Smart Watch, Activity Tracker with Heart Rate Monitor, Waterproof Pedometer Watch with Slee

May 22, 2019 | www.amazon.com

Features:

Heart Rate Monitor

Sleep Monitor

IP67 Life Waterproof

Smart Notifications

Connected GPS

7 Sport Modes

3 Alarm clock

Sedentary reminder

Remote Camera Control

Custom Dial

Music Player Control

6 Brightness Level Adjustment

[May 21, 2019] Updated 2019 Version Fitness Tracker HR, Activity Trackers Health Exercise Watch with Heart Rate

May 21, 2019 | www.amazon.com

Comfortable band with a fairly robust app, April 20, 2019

For a fitness tracker, this is fairly cheap and robust for the size and price. The tracker does basic readings of steps, pulse, blood pressure, sleep patterns, and physical activity. As far as I can tell, the tracker is fairly accurate in all of the above; however, given the limited routines for physical activity available, it might not be as accurate unless you stick to the regimens it readily provides. It's comfortable, with a long battery life (7-8 days with light activity, 5-6 days if you're a particularly active person), and is water-resistant enough to be worn in the shower.

The app itself (GloryFit) is a fairly robust app and helps portray meaningful metrics around activity and sleep patterns. Pairing and using the device is pretty easy - just be within range and select the device.

Much of the package leaves a bit to be desired. Both the instructions provided and the app are riddled with grammatical and spelling errors that might turn most people off, but if you can look past these, it's still a fairly good set. The setup with the main touchscreen button is a bit weird, as I'm used to full touchscreen interfaces instead of a singular button. In this case, the tracker uses the button to either cycle on short presses or select on long presses. The last minor gripe is taking off the band to expose the charging port for the tracker, but the instructions have a fairly nice picture of how to do that.

Pros: comfortable; easy to use; low profile

Cons: directions/app provided are hard to read/interpret; not intuitive to use

Recommended Use: for those with a fairly sedentary lifestyle; monitoring of basic vitals

[May 13, 2019] Big Israel How Israel's Lobby Moves America by Grant F. Smith

The Jewish lobby does not represent the views of the US Jewish population. They represent a small number of rich donors (the concentration is just staggering) and, of course, the Israeli government. Those organizations are non-representative authoritarian bodies with many functionaries serving for life or for extremely long tenures.
Notable quotes:
"... One stunning example of this influence occurred recently. At one time during the nominating process for the Republican candidate for President in the current election, every single aspirant to the nomination made a pilgrimage to Las Vegas to kiss the money ring of Sheldon Adelson, whose only declared interests are Israel and online gambling. This is the same super-patriot Sheldon Adelson who wanted Mitt Romney to pardon Jonathan Pollard, should Romney become President with Adelson's financial backing. ..."
Feb 05, 2016 | www.amazon.com

The latest in the powerful series of titles written by Grant Smith. Highly recommended: factual, documented and accessible data that should be required reading for high school students as well as their parents!

James Robinson , July 26, 2016

Would have been a tedious read for someone well acquainted with Israeli machinations

A superb compilation of organizations that receive tax-exempt status in the US while working exclusively on behalf of a foreign nation, Israel, often to the pronounced detriment of US interests and policies. It would have been a tedious read for someone well acquainted with Israeli machinations, but for someone new to the subject, the anger that the revelations produce makes the reading of this book a disquieting breeze.

Ronald Johnson , April 11, 2016
non-systematic conjecture about Zionism's amazing insider access to

Book Review of Big Israel, by Grant F. Smith

This is an important book, the latest from Grant F. Smith in the line of his previous investigations into what was referred to as, the "Zionist Occupied Government", an earlier, intuitive, non-systematic conjecture about Zionism's amazing insider access to, and influence of, U.S. foreign policy. It is interesting that Wikipedia describes the "ZOG" exclusively as an anti-semitic conspiracy theory attributed to a list of unsavory persons and racist organizations.

On the one hand, the American Israel Public Affairs Committee puts on a very public celebration every spring, the "policy conference", that is a pep rally of mandatory attendance by national Administration and Congressional people to celebrate Zionism. That event is public. But on the other hand, as Grant Smith analyzes, the "Israel Affinity Organizations" of the United States are a different archipelago.

As to what extent these organizations are legitimate lobbies, versus being mis-identified agents of a foreign power, I won't attempt to summarize, or, "give away" the content of the book; it is for people to read for themselves, to be informed, and to think for themselves.

Grant Smith presents numbers, names, and dates, to be reviewed and challenged by anyone who wants to. There is precedent for that. The USS Liberty attack by Israel was defended as a tragic mistake by author A. Jay Cristol, in his book, "The Liberty Incident". The Wiesenthal Center commissioned the author, Harold Brackman, to write, "Ministry of Lies, the Truth Behind the 'Secret Relationship Between Blacks and Jews' ". That referenced book was by the Nation of Islam. With facts in hand, the Electorate is empowered to make informed decisions about the US national interest, relative to Zionism.

Another good book is by Alison Weir on essentially the same subject, "Against Our Better Judgement: The Hidden History of How the U.S. Was Used to Create Israel". The Amazon pages for that book are loaded with discussions, which can be seen under that title. The Amazon book reviews are a valuable national resource that can be a place to survey public opinion, even allowing for the fact that positive reactions provide less motivation than negative ones to inspire writing an essay.

D L Neal , May 28, 2018
at least at this time- Wonderful, informative and important book

It is obvious why there is no middle ground in the reviews here, at least at this time. Wonderful, informative and important book.

Luther , May 15, 2016
"America. . . you can move very easily. . .." Netanyahu

No matter what your values -- Christian, Enlightenment, social justice, international law, natural law, the Kantian imperative, crimes against humanity, Judaism's own values (Israel "a light unto the nations" Isaiah 49:6) -- what Israel has done and is doing to the Palestinians is morally wrong.
Sure. People have done bad things to other people forever, but this evil is orchestrated by a large Zionist organization from all over the world. And the US is being made complicit in this immoral undertaking in the numerous ways Grant Smith explores in his book.

Exposing America's unfortunate entanglement is why he wrote this excellent book: 300 pages and 483 footnotes of support for the claims he makes.
The American democratic process is being corrupted at every level in the interests of Israel, and Smith gives chapter and verse on how this is being done.

One stunning example of this influence occurred recently. At one time during the nominating process for the Republican candidate for President in the current election, every single aspirant to the nomination made a pilgrimage to Las Vegas to kiss the money ring of Sheldon Adelson, whose only declared interests are Israel and online gambling. This is the same super-patriot Sheldon Adelson who wanted Mitt Romney to pardon Jonathan Pollard, should Romney become President with Adelson's financial backing.

In addition, Haim Saban of the Brookings Institution plays a similar role in the Democratic party. He has said: "I'm a one-issue guy, and my issue is Israel." He has promised to contribute as much money as needed to elect Hillary Clinton, someone who believes that Israel has a right to exist as a "Jewish state," with Jerusalem (an international city for millennia) as its capital (something no country in the world approves of, not even the USA).

  1. Is this the American democratic process in action?
  2. Is this what the Constitution intends?
  3. Is this our America?

Grant discusses in supported detail the areas of dual citizenship and dual loyalties (American citizens in the Israeli Defense Force); espionage (industrial and military); yearly billions to Israel with no benefit to the US; media control (no debating the facts of history; no Palestinians allowed to articulate and disseminate their narrative); tax exemption for money which goes to Jewish interests as well as the illegal settlements in Israel; perversion of education (forced Holocaust information but no discussion; anti-assimilation); foreign policy (the war with Iraq for the benefit of Israel; the demonization of Iran; no condemnation of Israel's nuclear capability in spite of the Non-Proliferation Treaty; use of the veto in the UN in Israel's interests; Middle East "regime change" wars); Israeli and Jewish influence in Congress (money, intense lobbying by AIPAC and free trips to Israel), and financial contributions only to candidates who are unequivocally pro-Israel, in some cases very large sums of money.

The point is that all of this is being done in spite of the wishes and best interests of the American people and even of Israel. It's not as though the American people voted to do bad things to the Palestinians: kill them, starve them, imprison them, steal from them, and control them. Quite the opposite: as Grant Smith explains, unbiased polls indicate that most Americans show no such support for Israel's mistreatment of the Palestinians and believe that if both sides would abide by international law, the Geneva Conventions, and the UN resolutions relating to Palestine, peace could be achieved between Jews and Arabs in Palestine.

But Zionism has a different agenda, an agenda that will use any means legal and illegal to promote its interests by getting the United States to back it up.
And that agenda is the problem because it is built on non-negotiable beliefs.

What can you say to someone who believes that the Bible mandates the establishment of a Jewish homeland in Palestine to the exclusion of the indigenous inhabitants?

Or, as Rabbi Ovadia Yosef said in 2010, that "The Goyim [non-Jews] are born only in order to serve us. Besides this, they have no place on earth -- only to serve the people Israel."

Not surprisingly, the never-ending "peace process" goes on and on, with no peace in sight.

The US, in spite of itself, continues to support this cruel charade against its own interests and at the expense of neighbors, friends, allies and innocent parties in Palestine and elsewhere in the world.

Grant Smith's excellent book is an attempt to raise America's awareness to the point that something might be done.

[May 13, 2019] America The Farewell Tour by Chris Hedges

Sep 05, 2018 | www.amazon.com
Chapter 1 - DECAY ............... 1
Chapter 2 - HEROIN .............. 59
Chapter 3 - WORK ................ 83
Chapter 4 - SADISM .............. 112
Chapter 5 - HATE ................ 150
Chapter 6 - GAMBLING ............ 203
Chapter 7 - FREEDOM ............. 230
Acknowledgments ................. 311
Notes ........................... 315
Bibliography .................... 351
Index ........................... 359

I walked down a long service road into the remains of an abandoned lace factory. The road was pocked with holes filled with fetid water. There were saplings and weeds poking up from the cracks in the asphalt. Wooden crates, rusty machinery, broken glass, hulks of old filing cabinets, and trash covered the grounds. The derelict complex, 288,000 square feet, consisted of two huge brick buildings connected by overhead, enclosed walkways.

The towering walls of the two buildings, with the service road running between them, were covered with ivy. The window panes were empty or had frames jagged with shards of glass. The thick wooden doors to the old loading docks stood agape. I entered the crumbling complex through a set of double wooden doors into a cavernous hall.

The wreckage of industrial America lay before me, home to flocks of pigeons that, startled by my footsteps over the pieces of glass and rotting floorboards, swiftly left their perches in the rafters and air ducts high above my head. They swooped, bleating and clucking, over the abandoned looms.

The Scranton Lace Company was America. It employed more than 1,200 workers on its imported looms, some of the largest ever built.

Gary Moreau, Author TOP 500 REVIEWER, September 5, 2018

Washington is fiddling but it is the capitalist collective that is setting the fires

Throughout history, all great civilizations have ultimately decayed. And America will not be an exception, according to former journalist and war correspondent, Chris Hedges. And while Hedges doesn't offer a date, he maintains we are in the final throes of implosion -- and it won't be pretty.

The book is thoroughly researched and the author knows his history. And despite some of the reviews it is not so much a political treatise as it is an exploration of the American underbelly -- drugs, suicide, sadism, hate, gambling, etc. And it's pretty dark; although he supports the picture he paints with ample statistics and first person accounts.

There is politics, but the politics provides the context for the decay. And it's not as one-dimensional as other reviewers seemed to perceive. Yes, he is no fan of Trump or the Republican leadership. But he is no fan of the Democratic shift to identity politics, or antifa, either.

One reviewer thought he was undermining Christianity but I didn't get that. He does not support "prosperity gospel" theology, but I didn't see any attempt to undermine fundamental religious doctrine. He is, after all, a graduate of Harvard Divinity School and an ordained Presbyterian minister.

He puts the bulk of the blame for the current state of decay, in fact, where few other writers do -- squarely on the back of capitalist America and the super-companies who now dominate nearly every industry. The social and political division we are now witnessing, in other words, has been orchestrated by the capital class; the class of investors, banks, and hedge fund managers who don't create value so much as they transfer it to themselves from others with less power. And I think he's spot on right.

We have seen a complete merger of corporate and political America. Politicians on both sides of the aisle serve at the pleasure of the capitalist elite because they need their money to stay in power. Corporations enjoy all the rights of citizenship save voting, but who needs to actually cast a ballot when you can buy the election.

And what the corpocracy, as I call it, is doing with all that power is continuing to reshuffle the deck of economic opportunity to ensure that wealth and income continue to polarize. It's a process they undertake in the name of tax cuts for the middle class (which aren't), deregulation (which hurts society as a whole), and the outright transfer of wealth and property (including millions of acres of taxpayer-owned land) from taxpayers to shareholders (the 1%).

I know because I was part of it. As a former CEO and member of four corporate boards I had a front row seat from the 1970s on. The simplest analogy is that the gamblers rose up and took control of the casinos and the government had their backs in a kind of quid pro quo, all having to do with money.

They made it stick because they turned corporate management into the ultimate capitalists. The people who used to manage companies and employees are now laser focused on managing the companies' stock price and enhancing their own wealth. Corporate executives, in a word, became capitalists, not businessmen and women, giving the foxes unfettered control of the hen house.

They got to that position through a combination of greed -- both corporate management's and that of shareholder activists -- but were enabled and empowered by Washington. Beginning in the 1970s the Justice Department antitrust division, the Labor Department, the EPA, and other institutions assigned the responsibility to avoid the concentration of power that Adam Smith warned us about, and to protect labor and the environment, were all gutted and stripped of power.

They blamed it on globalism, but that was the result, not the cause. Gone are the days of any corporate sense of responsibility to the employees, the collective good, or the communities in which they operate and whose many services they enjoy. It is the corporate and financial elite, and they are now one and the same, who have defined the "me" world in which we now live.

And the process continues: "The ruling corporate kleptocrats are political arsonists. They are carting cans of gasoline into government agencies, the courts, the White House, and Congress to burn down any structure or program that promotes the common good." And he's right. And Trump is carrying those cans.

Ironically, Trump's base, who have been most marginalized by the corpocracy, are the ones who put him there to continue the gutting. But Hedges has an explanation for that. "In short, when you are marginalized and rejected by society, life often has little meaning. There arises a yearning among the disempowered to become as omnipotent as the gods. The impossibility of omnipotence leads to its dark alternative -- destroying like the gods." (Reference to Ernest Becker's The Denial of Death.)

The economic history and understanding of economic theory here is rich and detailed. Capitalism, as Marx and others pointed out, creates great wealth in the beginning but is doomed to failure due to its inability to continue to find sources of growth and to manage inequities in wealth creation. And you don't have to be a socialist to see that this is true. Capitalism must be managed. And our government is currently making no attempt to do so. It is, in fact, dynamiting the institutions responsible for doing so.

All told, this is a very good book. If you don't like reading about underbellies (I found the chapter devoted to sadism personally unsettling, being the father of two daughters), you will find some of it pretty dark. Having said that, however, the writing is very good and Hedges never wallows in the darkness. He's clearly not selling the underbelly; he's trying to give it definition.

I did think that some of the chapters might have been broken down into different sub-chapters and there is a lack of continuity in some places. All told, however, I do recommend the book. There is no denying the fundamental thesis.

The problem is, however, we're all blaming it on the proverbial 'other guy.' Perhaps this book will help us to understand the real culprit -- the capitalist collective. "The merging of the self with the capitalist collective has robbed us of our agency, creativity, capacity for self-reflection, and moral autonomy." True, indeed.


S. Ferguson , September 1, 2018

"Justice is a manifestation of Love..."

The inimitable Hedges is not only a saint with a penetrating intelligence, but also a man of superior eloquence with the power to pull you into his descriptions of the collapse of western civilization. Hedges says that the new American Capitalism no longer produces products -- rather America produces escapist fantasies. I found this paragraph [page 233] particularly relevant. The act of being dedicated to the 'greater good' has in itself become dangerous.

Chris Hedges: "We do not become autonomous and free human beings by building pathetic, tiny monuments to ourselves. It is through self-sacrifice and humility that we affirm the sanctity of others and the sanctity of ourselves. Those who fight against cultural malice have discovered that life is measured by infinitesimal and often unacknowledged acts of solidarity and kindness. These acts of kindness spin outward to connect our atomized and alienated souls to others. The good draws to it the good. This belief -- held although we may never see empirical proof -- is profoundly transformative. But know this: when these acts are carried out on behalf of the oppressed and the demonized, when compassion defines the core of our lives, when we understand that justice is a manifestation of love, we are marginalized and condemned by our sociopathic elites."

Amazon Customer , September 7, 2018
Great (Recycled) Hedges Rants

If you've never read Hedges - get it now. If you've read him before - there's nothing new here.

Chris Hedges is a writer who has a knack for seeing the big picture and connecting the dots. A chronic pessimist in the best sense, a bitter prophet warning us of the last days of the decaying empire, his page-turning prose carving through the morass of today's mania and derangement. For that, he's in the company somewhere between Cornel West and Morris Berman (the latter, whose book Why America Failed, is better than this. If you're familiar with Hedges, but not Morris Berman, go find Berman instead).

I give this three stars only because there isn't much new here if you're familiar with his material. I felt this book to be an update of Empire of Illusion, punched up by old articles from his weekly column at Truthdig. Aside from the introductory chapter, he revisits themes of sadism, the decline of literacy, of labor, of democratic institutions, and so on, which are too familiar. The pages and pages detailing the BDSM craze felt excessive in the kind of prurient voyeurism that journalistic approaches can fall into. Not saying he's wrong at all, but this tone could put off some readers, erring on excessive preacherly seminarian virtue signaling as he points out the sins of the world and shouts: "Look! Look at what we've done!"

swisher , August 21, 2018
I'd give a million stars if possible

Heartbreaking to read but so true. In our "truth is not truth" era Mr. Hedges once again writes the sad and shocking obituary for American Democracy and sounds the prophetic alarm to those revelers while Rome burns. All empires come and go, but I never thought I'd be a witness to one. Something sick and traitorous has infected the soul of America, and I fear it's going to be some demented combination of the worst elements of 1984 and Brave New World. The most important work currently published, but will anyone listen? Will anything change?

ChrisD , September 5, 2018
Well worth reading - an important perspective

The author is honest and intelligent. When you take a detailed look at reality it can seem harsh.

Don't shoot the messenger who has brought bad news. We need to know the truth. Read, listen, learn. Engage in positive actions to improve the situation.
Chris has given us a wake-up call.

[May 11, 2019] A Texan Looks At Lyndon: A Study In Illegitimate Power

May 31, 2003 | www.amazon.com

Kurt Harding

A Devastating Diatribe, May 31, 2003

It would be an understatement to say that author Haley does not like Lyndon Baines Johnson. And despite the fact that his book is an unrelenting tirade against all things Lyndon, it provides a useful service in reminding the reader of how Johnson trampled and double-crossed friend and foe alike in his single-minded lust for power.

I am fairly conservative politically, but I am open-minded enough to recognize and oppose corruption whether practiced by liberals or conservatives. In my lifetime, Johnson, Nixon, and Clinton have been shining examples of the worst impulses in American presidential politics in which greed and lust for either power or money ended up overshadowing any of their real achievements.

Haley shows that Johnson was a man of few real principles, neither liberal nor conservative, but rather a man who almost always wanted to know which way the wind was blowing before taking a stand on any important issue. Johnson was a man who used all his powers of persuasion and veiled threats to get what he wanted, and woe unto anyone who stood in his way.

He was a man who knew and used the old adage "It's not what you know, but who you know" to Machiavellian extremes.

But he was also a man of sometimes great political courage who would rarely give an inch once he took a stand. He hated those who opposed him, nursed resentments, and wreaked revenge on those who crossed him in the least as most of his enemies and many of his friends learned to their sorrow. From the earliest days, he was involved with corrupt Texas politicians from the local to the state level and swam in the seas of corporate corruption with the likes of the infamous swindler Billy Sol Estes and others of his stripe.

Admittedly, the conservatism of the author is the conservatism of a bygone age and the reader will recognize that the book is meant to be a partisan attack on Johnson. Some of the attacks on Johnson are made solely for political reasons as Johnson was clever enough to outmaneuver Haley's ideological brothers and sisters. But Johnson surrounded himself with enough scummy characters and got involved in so many underhanded political AND business deals that he deserves the rough treatment given him in Haley's devastating diatribe.

No matter your political leanings, your eyes will be opened when you read A Texan Looks At Lyndon. The book is well-written and often riveting in its allegations and revelations, but it loses one star for occasional hysteria. If US or Texas politics interests you, then I highly recommend this.

Randall Ivey

You have been warned, July 31, 2000

Haley wrote this book (and published it himself) in 1964 basically as a campaign tract for Barry Goldwater. In the intervening years it has become a classic of its kind, a philippic, to use M.E. Bradford's term, tracing the illegitimate rise to power of Lyndon Baines Johnson.

If you're politically naive, this book will grow hair on your chest. It's an unblinking, fearless portrait of Johnson's wheeling dealing and underhanded methods to achieve the power, prestige, and money he craved all his life.

Haley names all the names and lays out facts and figures for the reader to make up his mind. And the reader winds up shaking his head in utter astonishment. The best part of the book is that detailing Johnson's eventual election to the U.S. Senate in a contest with former Gov. Coke Stevenson.

The election was clearly Stevenson's, but through the machinations of George Parr, the notorious Duke of Duval County, the results were turned around in LBJ's favor. Investigators later found that among those voting in the primary were people who didn't live in the county anymore and people who weren't alive at all. But the results stood.

(An interesting and amusing aside: when Haley ran for Texas governor in 1956, he approached Parr and said, "I'm Evetts Haley. I'm running for governor, and if I win, it will be my privilege to put you in jail."

Parr's reply: "I believe you will." Parr, the Artful Dodger of Texas politics for years, eventually killed himself.)

At times the book grows tiresome, especially in the sections on the Bobby Baker and Billie Sol Estes scandals, where Haley unleashes such a torrent of names and numbers on the reader as to be sometimes confusing.

[Apr 23, 2019] The Secret Team The CIA and Its Allies in Control of the United States and the World by L. Fletcher Prouty

Notable quotes:
"... The CIA is the center of a vast mechanism that specializes in Covert Operations ... or as Allen Dulles used to call it, "Peacetime Operations." ..."
"... the CIA is the willing tool of a higher level Secret Team, or High Cabal, that usually includes representatives of the CIA and other instrumentalities of the government, certain cells of the business and professional world and, almost always, foreign participation. It is this Secret Team, its allies, and its method of operation that are the principal subjects of this book. ..."
"... vast intergovernmental undercover infrastructure and its direct relationship with great private industries, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The Secret Team has very close affiliations with elements of power in more than three-score foreign countries and is able when it chooses to topple governments, to create governments, and to influence governments almost anywhere in the world. ..."
"... the power of the Team is enhanced by the "cult of the gun" and by its sometimes brutal and always arbitrary anti-Communist flag waving, even when real Communism had nothing to do with the matter at hand. ..."
"... To be a member, you don't question, you don't ask; it's "Get on the Team" or else. One of its most powerful weapons in the most political and powerful capitals of the world is that of exclusion. To be denied the "need to know" status, like being a member of the Team, even though one may have all the necessary clearances, is to be totally blackballed and eliminated from further participation. Politically, if you are cut from the Team and from its insider's knowledge, you are dead. In many ways and by many criteria the Secret Team is the inner sanctum of a new religious order. ..."
"... At the heart of the Team, of course, are a handful of top executives of the CIA and of the National Security Council (NSC), most notably the chief White House adviser to the President on foreign policy affairs. ..."
"... It is often quite difficult to tell exactly who many of these men really are, because some may wear a uniform and the rank of general and really be with the CIA and others may be as inconspicuous as the executive assistant to some Cabinet officer's chief deputy. ..."
"... even more damaging to the coherent conduct of foreign and military affairs, it is a bewildering collection of semi-permanent or temporarily assembled action committees and networks that respond pretty much ad hoc to specific troubles and to flash-intelligence data inputs from various parts of the world, sometimes in ways that duplicate the activities of regular American missions, sometimes in ways that undermine those activities, and very often in ways that interfere with and muddle them. ..."
"... This report is a prime example of how the Secret Team, which has gained so much control over the vital foreign and political activities of this government, functions. ..."
"... Although even in his time he had seen the beginning of the move of the CIA into covert activities, there can be little doubt that the "diversion" to which he made reference was not one that he would have attributed to himself or to any other President. Rather, the fact that the CIA had gone into clandestine operations and had been "injected into peacetime cloak-and-dagger operations," and "has been so much removed from its intended role" was more properly attributable to the growing and secret pressures of some other power source. As he said, the CIA had become "a symbol of sinister and mysterious foreign intrigue." ..."
Apr 23, 2019 | www.amazon.com

I was the first author to point out that the CIA's most important "Cover Story" is that of an "Intelligence" agency. Of course the CIA does make use of "intelligence" and "intelligence gathering," but that is largely a front for its primary interest, "Fun and Games." The CIA is the center of a vast mechanism that specializes in Covert Operations ... or as Allen Dulles used to call it, "Peacetime Operations."

In this sense, the CIA is the willing tool of a higher level Secret Team, or High Cabal, that usually includes representatives of the CIA and other instrumentalities of the government, certain cells of the business and professional world and, almost always, foreign participation. It is this Secret Team, its allies, and its method of operation that are the principal subjects of this book.

It must be made clear that at the heart of Covert Operations is the denial by the "operator," i.e. the U.S. Government, of the existence of national sovereignty. The Covert operator can, and does, make the world his playground ... including the U.S.A. Today, early 1990, the most important events of this century are taking place with the ending of the "Cold War" era, and the beginning of the new age of "One World" under the control of businessmen and their lawyers, rather than the threat of military power. This scenario for change has been brought about by a series of Secret Team operations skillfully orchestrated while the contrived hostilities of the Cold War were at their zenith.

... ... ...

We may wish to note that in the book "Gentleman Spy: The Life of Allen Dulles," the author, Peter Grose, cites Allen Dulles's response to an invitation to the luncheon table from Hoover's Secretary of State, Henry L. Stimson. Allen Dulles assured his partners in the Sullivan & Cromwell law firm, "Let it be known quietly that I am a lawyer and not a diplomat." He could not have made a more characteristic and truthful statement about himself. He always made it clear that he did not "plan" his work; he was always the "lawyer" who carried out the orders of his client, whether the President of the United States or the president of the local bank.

The Secret Team (ST) being described herein consists of security-cleared individuals in and out of government who receive secret intelligence data gathered by the CIA and the National Security Agency (NSA) and who react to those data, when it seems appropriate to them, with paramilitary plans and activities, e.g. training and "advising" -- a not exactly impenetrable euphemism for such things as leading into battle and actual combat -- Laotian tribal troops, Tibetan rebel horsemen, or Jordanian elite Palace Guards.

Membership on the Team, granted on a "need-to-know" basis, varies with the nature and location of the problems that come to its attention, and its origins derive from that sometimes elite band of men who served with the World War II Office of Strategic Services (OSS) under the father of them all, General "Wild Bill" William J. Donovan, and in the old CIA.

The power of the team derives from its vast intergovernmental undercover infrastructure and its direct relationship with great private industries, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The Secret Team has very close affiliations with elements of power in more than three-score foreign countries and is able when it chooses to topple governments, to create governments, and to influence governments almost anywhere in the world.

Whether or not the Secret Team had anything whatsoever to do with the deaths of Rafael Trujillo, Ngo Dinh Diem, Ngo Dinh Nhu, Dag Hammarskjold, John F. Kennedy, Robert F. Kennedy, Martin Luther King, and others may never be revealed, but what is known is that the power of the Team is enhanced by the "cult of the gun" and by its sometimes brutal and always arbitrary anti-Communist flag waving, even when real Communism had nothing to do with the matter at hand.

The Secret Team does not like criticism, investigation, or history and is always prone to see the world as divided into but two camps -- "Them" and "Us." Sometimes the distinction may be as little as one dot, as in "So. Viets" and "Soviets," the So. Viets being our friends in Indochina, and the Soviets being the enemy of that period. To be a member, you don't question, you don't ask; it's "Get on the Team" or else. One of its most powerful weapons in the most political and powerful capitals of the world is that of exclusion. To be denied the "need to know" status, like being a member of the Team, even though one may have all the necessary clearances, is to be totally blackballed and eliminated from further participation. Politically, if you are cut from the Team and from its insider's knowledge, you are dead. In many ways and by many criteria the Secret Team is the inner sanctum of a new religious order.

At the heart of the Team, of course, are a handful of top executives of the CIA and of the National Security Council (NSC), most notably the chief White House adviser to the President on foreign policy affairs. Around them revolves a sort of inner ring of Presidential officials, civilians, and military men from the Pentagon, and career professionals of the intelligence community. It is often quite difficult to tell exactly who many of these men really are, because some may wear a uniform and the rank of general and really be with the CIA and others may be as inconspicuous as the executive assistant to some Cabinet officer's chief deputy.

Out beyond this ring is an extensive and intricate network of government officials with responsibility for, or expertise in, some specific field that touches on national security or foreign affairs: "Think Tank" analysts, businessmen who travel a lot or whose businesses (e.g., import-export or cargo airline operations) are useful, academic experts in this or that technical subject or geographic region, and quite importantly, alumni of the intelligence community -- a service from which there are no unconditional resignations. All true members of the Team remain in the power center whether in office with the incumbent administration or out of office with the hard-core set. They simply rotate to and from official jobs and the business world or the pleasant haven of academe.

Thus, the Secret Team is not a clandestine super-planning-board or super-general-staff. But even more damaging to the coherent conduct of foreign and military affairs, it is a bewildering collection of semi-permanent or temporarily assembled action committees and networks that respond pretty much ad hoc to specific troubles and to flash-intelligence data inputs from various parts of the world, sometimes in ways that duplicate the activities of regular American missions, sometimes in ways that undermine those activities, and very often in ways that interfere with and muddle them. At no time did the powerful and deft hand of the Secret Team evidence more catalytic influence than in the events of those final ninety days of 1963, which the "Pentagon Papers" were supposed to have exposed. The New York Times shocked the world on Sunday, June 13, 1971, with the publication of the first elements of the Pentagon Papers.

The first document the Times selected to print was a trip report on the situation in Saigon, credited to the Secretary of Defense, Robert S. McNamara, and dated December 21, 1963. This was the first such report on the situation in Indochina to be submitted to President Lyndon B. Johnson. It came less than thirty days after the assassination of President John F. Kennedy and less than sixty days after the assassinations of President Ngo Dinh Diem of South Vietnam and his brother and counselor Ngo Dinh Nhu.

Whether from some inner wisdom or real prescience or merely simple random selection, the Times chose to publish first, from among the three thousand pages of analysis and four thousand pages of official documents that had come into its hands, that report which may stand out in history as one of the key documents affecting national policy in the past quarter-century -- not so much for what it said as for what it signified. This report is a prime example of how the Secret Team, which has gained so much control over the vital foreign and political activities of this government, functions.

... ... ...

...President Harry S. Truman, observing the turn of events since the death of President Kennedy, and pondering developments since his Administration, wrote for the Washington Post a column also datelined December 21, 1963:

For some time I have been disturbed by the way the CIA has been diverted from its original assignment. It has become an operational and at times a policy-making arm of the government.... I never had any thought that when I set up the CIA that it would be injected into peacetime cloak-and-dagger operations.

Some of the complications and embarrassment that I think we have experienced are in part attributable to the fact that this quiet intelligence arm of the President has been so removed from its intended role that it is being interpreted as a symbol of sinister and mysterious foreign intrigue and a subject for cold war enemy propaganda.

Truman was disturbed by the events of the past ninety days, those ominous days of October, November, and December 1963. Men all over the world were disturbed by those events. Few men, however, could have judged them with more wisdom and experience than Harry S. Truman, for it was he who, in late 1947, had signed into law the National Security Act. This Act, in addition to establishing the Department of Defense (DOD) with a single Secretary at its head and with three equal and independent services -- the Army, Navy, and Air Force -- also provided for a National Security Council and the Central Intelligence Agency. And during those historic and sometimes tragic sixteen years since the Act had become law, he had witnessed changes that disturbed him, as he saw that the CIA "had been diverted" from the original assignment that he and the legislators who drafted the Act had so carefully planned.

Although even in his time he had seen the beginning of the move of the CIA into covert activities, there can be little doubt that the "diversion" to which he made reference was not one that he would have attributed to himself or to any other President. Rather, the fact that the CIA had gone into clandestine operations and had been "injected into peacetime cloak-and-dagger operations," and "has been so much removed from its intended role" was more properly attributable to the growing and secret pressures of some other power source. As he said, the CIA had become "a symbol of sinister and mysterious foreign intrigue."


5.0 out of 5 stars

The New Corporate (non-State acting) Privatized One World Order December 4, 2012

While we sit stunned into complete disbelief and silence, trying to make sense of, understand, and decode the strongly suspected connections between the most curious political and military events of our times, the author, Colonel L. Fletcher Prouty, in this book, "The Secret Team," has already decoded everything for us. From the JFK assassination, Watergate, the Iran-Contra Affair, the Gulf of Tonkin incident, repeated bank bust-outs (like BCCI and Silverado), the cocaine connection from Mena, Arkansas to Nicaragua, the "crack" cocaine explosion in America's inner cities, the recent housing crash, and the general Wall Street sponsored financial meltdown, and now even from the wildest recesses of our collective imagination (dare I say it, maybe even 9/11?), Colonel Prouty, the fabled Mr. "X" in the movie "JFK," lays out the bureaucratic structure behind all the answers here.

What Colonel Prouty tells us is that right before our own eyes, we are experiencing a paradigm shift in international relations and world affairs, one that has quietly moved us from the "old order," where the sovereign nation and its armies and national ideologies once sat at the center of world events and predominated, into a new "One World, business-run, corporate, privatized global order," in which "the corporate powers that be" sit on the throne in the clock tower; and where, as a result of their machinations, true national sovereignty has seeped away to the point that we can safely say it no longer exists.

The good Colonel tells us that the most important events of this century are taking place right before our eyes, as the Cold War era has already given way to a new age of "One World" under the control of businessmen and their hired guns, their lawyers -- rather than under the threat of military power and ideological differences. In this new, completely "privatized world order," big business, big lawyers, big bankers, big politicians, big lobbyists, and even bigger money-men run and rule the entire world from behind a national security screen inaccessible to the average citizen. It is this paradigm shift, and the wall of secrecy, that have brought us the "Secret Team" and the series of strange, inexplicable events that it has skillfully orchestrated and that keep recurring from time to time both within the U.S. and throughout the world.

This new bureaucratic entity is called a "Secret Team" for good reasons: because like any team, it does not create its own game plan, its own rules, or its own reality. The team plays for a coach and an owner. It is the coach and the owner who write the scripts and create and "call" the plays. The drama of reality that we see on the international screen is a creation of the "Power Elite," as it is executed by the "Secret Team." The power of the team comes from its vast intergovernmental undercover infrastructure and its direct relationship with private industries, the military, mutual funds and investment houses, universities, and the news media, including foreign and domestic publishing houses. The beauty of the "Secret Team" is that it is not a clandestine super-planning-board or super-general-staff such as is frequently attributed to the Bilderberg Group or the Trilateral Commission, but a bewildering collection of ad hoc and semi-permanent action committees and networks that can come into being and then dissolve as specific needs, troubles, and flash-points dictate. It can create, influence, or topple governments around the globe at the behest and on the whim of its coaches, the "Power Elite."

As the sociologist C. Wright Mills told us nearly a half century ago, the members of the "Power Elite" operate beyond national borders, beyond the reach of the public, and have no national loyalties -- or even return addresses. They operate in the shadows and run the world by remote control and by making us completely dependent upon them and their hidden machinations. Invisibly, they maneuver and jockey to control every aspect of our lives and the infrastructure and markets upon which we depend for our survival, the most important and essential among them being our ability to produce and distribute our own food, water, and energy. As a result of this dependency, and despite mythology to the contrary, Colonel Prouty tells us that we are becoming the most dependent society that has ever lived. And the future viability of an infrastructure that is not controlled and manipulated by this "global Power Elite" is diminishing to the point of non-existence.

With climate changes and terrorism already causing serious disruptions in the normal flow of our lives, governments are becoming less and less able to serve as the people's protector of last resort. Already, one of the politicians who ran for President of the United States in its most recent election, Governor Mitt Romney, suggested that FEMA be turned over to a privately run firm. And all of the agencies of government that he did not suggest be privatized (or that have not already been privatized), except for the military, he suggested be abolished. As well, we also see the concomitant rise of the Blackwaters of the world, private firms that have already begun to take over the lion's share of the responsibilities of our volunteer military. Likewise, our prisons, healthcare system, and schools are also being privatized, and everything else is being "outsourced" to the lowest bidder on the global labor market. The book, however, is not just about international politics or international economics, per se, but is also about the primary bureaucratic instrumentality through which the "Power Elite" operates. This instrumentality, as noted above, is called "the Secret Team."

How does Colonel L. Fletcher Prouty know about the "Secret Team"? Because he used to be one of its Pentagon operational managers. I believe then that out of prudence, when the man who oversaw management of and liaised with "the Secret Team" for nine years at the Pentagon as an Air Force Colonel (and who incidentally was also sent on a wild goose chase to Antarctica in order to get him out of the country, days before the JFK assassination) tells us that something is wrong in Denmark, it is high time to listen up. In a chilling narrative, Colonel Prouty relates to us how he found out about the assassination of JFK. It was during a stopover in New Zealand on his return from the wild goose chase his superiors had sent him on to get him out of the way. Hours BEFORE the assassination had even occurred, somehow the New Zealand press already had the pre-planned talking points on Lee Harvey Oswald. Somehow they mistakenly deployed them prematurely, reporting well in advance of the assassination itself that Oswald was JFK's lone assassin. How could such a thing happen unless there was a very high-level conspiracy?

The Secret Team, according to Prouty, consists of a bunch of renegade CIA intelligence operatives who are signed up for life and operate under the full protection and pay of the "Power Elite," itself a cabal of wealthy men with interlocking interests beholden only to their own hunger for power, profit, and greed. The "Power Elite" relies upon this covert team of highly trained specialists to get things done without questions being asked and without moral squeamishness.

Operating outside the normal parameters of political authorization, morality, direction, and law, and hiding behind a wall shielded by national security secrecy, very much like the mafia, the "Secret Team" always gets the job done. They are allowed to ply their immoral trade with both impunity and legal immunity. In short, in the modern era, in the new "One World Corporate Order," they have proven again and again that at worst they are lawless, and at best they are a law unto themselves. The members of the "Secret Team" have become the new jack-booted foot soldiers we see trampling over our dying democracy. As we move deeper and deeper into the uncharted realms of the new corporate-run "One World Order," "we the people" have a lot of questions we must ask ourselves if the democracy we once knew is to endure.

The climax of the book appears in chapter 22 (entitled "Camelot"). It is a beautifully crafted object lesson for the future of what remains of our democracy. It is a narrative summary of how JFK tried but failed to deal with the emerging paradigm shift in power from the Executive branch of the USG to the CIA and the "Secret Team," that is to say, from a system of duly elected representatives to one dictated by the whims of the "Power Elite" through their "Secret Team." JFK's assassination is just the most dramatic consequence of how our then young President failed to save the USG from usurpation of its power by a cabal of anonymous evil men intent on ruling the world. Colonel Prouty's story ends somewhat as follows.

The Bay of Pigs operation was the seminal event in the clandestine transfer of power from the "normal government" to the CIA's "Secret Team." It was done primarily via the thinly transparent interface of the military -- playing a dual role as both military officers reporting to their Commander in Chief, and at the same time as undercover "clandestine operatives" reporting (behind the President's back) to the CIA (and of course, through it, to the "Power Elite"). In the book, there is little question where their split loyalties lay.

The key ruse that provided the glue that made this high-level "grifter-like scam" (with the U.S. President as its "mark") work to perfection was the phrase "anti-Communist counterinsurgency." Put to skillful use in the hands of trained specialists, these words had a powerful and purposeful dual meaning. They meant one thing to clandestine insider members of the "Secret Team," and quite another to "no need to know" outsiders like the American public (and in this case the whole USG, including the Commander in Chief, President JFK himself). This willful ambiguity in terminology and the duality in the roles of those involved does most of the heavy lifting in the drama played out by the "insiders," which resulted in the usurpation and shift of power from the Presidency to the CIA.

The "Bay of Pigs operation" proved to be the defining, seminal, and pivotal case in point. It began as a small clandestine "anti-Communist counterinsurgency" operation run by the CIA (as was also the case with Iran, Guatemala, Nicaragua, Indonesia, Laos, Cambodia, Grenada, Angola, and Santo Domingo), ostensibly under the oversight of the USG, but in fact ended up as a huge CIA-run military failure, one minus the requisite oversight from the US President. The devil of how this happened lies in the slimy details that went on behind the scenes and that are skillfully unveiled in this book. They are details that the reader can also get from a careful reading between the lines of "The Pentagon Papers."

As the Bay of Pigs operation slowly morphed from a small-scale USG-run operation "with oversight" into a huge, expensive, and poorly run CIA operation without any oversight whatsoever, the rules of the game also changed. They changed from being about U.S. security to being about the greed, power, and profits of the "Power Elite," as those objectives were implemented through the "Secret Team." The key to the "Power Elite" getting what they wanted was always accomplished by stoking the ideological fires up to an international boiling point, so that more and more military hardware could be produced, bought, and sold.

Likewise, the roles of the primary players also morphed and changed -- from "clandestine operators" in military uniforms, to "military operators" reporting to their CIA handlers, and thus to the "Power Elite." The executive branch (the ostensible oversight body of the government) was none the wiser, since it was not yet aware that it was "being played" by the CIA and thus did not yet know it was being treated in the same way the public is normally treated: as an "excluded outsider" lacking the required "need to know."

Through this bureaucratic sleight of hand, the partial control and power the USG normally exercised in its oversight role were covertly usurped, as the military operators (and even members of the President's own staff) proved to be "insiders," i.e., members of the "Secret Team," "playing" the President like a bass fiddle as he and his team became the "marks" in an insiders' "con game" in which power and control of the USG were at stake.

When JFK finally "wised up," it was too late. By then the train had already left the station, with the CIA firmly in the driver's seat. Since the JFK era, U.S. foreign policy has been a clear case of the CIA tail wagging the USG dog. And the best proof of the evil intentions of the "Secret Team" calling the shots within the CIA is that no sooner had the Bay of Pigs blown up in spectacular and embarrassing failure than the CIA put the wheels back in motion to duplicate, expand, and even generalize this failed bureaucratic formula in Vietnam.

But this time JFK was ready for them and issued NSAM-55 and NSAM-57, both decision directives designed to put the brakes on the CIA and return the usurped power to the military, where the President was the Commander in Chief. But the CIA was already two steps ahead of JFK. His own staff had been so compromised that he had nowhere to turn. He was penetrated, and thus effectively checkmated, by an agency of his own government. The more he fought back, the more ground he lost, and the more his back was up against the wall. By the time November 22, 1963, came around, JFK literally had no bureaucratic friends and nowhere to turn.

I only regret that an earlier edition of this book had been lying around unread in my library for more than a decade. Five Stars.

Stephen Courts, August 7, 2012

Secret Team (CIA) By Colonel Fletcher Prouty

Though this book is now over 40 years old, I found the information very relevant and 100% trustworthy, coming from one of America's true patriots. Colonel Prouty served his country for decades as a pilot and as an integral part of the Department of Defense and the CIA. Though for nine years Colonel Prouty was the liaison between the Air Force and the CIA's clandestine affairs, he is able to reveal confidential information that would typically be classified "Top Secret," because Colonel Prouty did not work for the CIA and therefore did not have to sign a confidentiality agreement with the nefarious agency.

What is fascinating about Colonel Prouty is that he was everywhere throughout his career. He watched world affairs as they unfolded, meeting the most influential leaders of his time, from FDR, Stalin, Churchill, and Ike to every general and admiral in our military. For the nine years from 1954 to 1963, he was the go-to guy for the military leaders and the president, including both Ike and JFK. In other words, Colonel Prouty writes from personal and direct experience.

Now the meat of the book is about the creation and abuses of the CIA, created in 1947. From the end of World War Two until the mid-1970's, the CIA abused its primary responsibility of intelligence gathering to carry out literally unchecked clandestine and covert upheavals in every part of the world. The CIA, particularly under Allen Dulles, created one coup d'etat after another. The reader will realize that from 1945 until the Marines reached the shores of Viet Nam in 1965, every piece of skulduggery in Viet Nam was done by the CIA. The CIA had infiltrated the entire government, from the Department of Defense to the Department of State. Many people would be shocked to know that what passed as Defense activity was actually generals and admirals, wearing their uniforms and working for the CIA. Whether it was advising the President, subverting ambassadors, or lying to Congress, the CIA ruled, and few knew what it was really doing. Colonel Prouty accurately tells the stories of every subversive, nefarious act the CIA was involved in. One example in particular stands out. It was Ike's goal at the end of his 2nd term as president to hold a peace conference with the USSR, to sign a peace treaty and end the cold war. In direct violation of the president's specific instructions not to fly U-2 flights prior to the conference in June of 1960, the CIA flew the ill-fated Gary Powers flight that guaranteed the conference would not go forward. This was a most important conference that could have brought nuclear peace accords decades before they were eventually signed. Dulles and his henchmen deliberately made sure that Gary Powers not only violated the order not to fly these observation flights; they ensured that the plane would be downed by sabotaging the flight, forcing Ike to either admit he knew or fire the bastards who embarrassed him. Ike chose to take responsibility, and thus the peace talks were cancelled. There was also another flight, in 1958, that was downed over the Soviet Union.

Most Americans would be shocked to know the CIA has its own private airline, Air America. This is no small airline. Had Colonel Prouty written this book later, he could have connected the CIA with the massive drug smuggling that has devastated American cities. The proceeds of this smuggling are used to finance illicit involvement in other sovereign countries.

Bottom line: this is an important book, as is his 1993 JFK & Viet Nam. Colonel Prouty was a significant advisor to Oliver Stone and his masterpiece, JFK. I am currently finishing a rereading of said book. If you want to know who has controlled our foreign policy (against the charter that created this monstrosity) since the mid-1940's, this is an excellent book to begin with. It is my personal opinion, having read many books on the CIA, that its main function is to serve the multinational corporations and the bankers that exploit the less developed countries around the world, and to ensure that there will never be peace. There will not be a World War Three, because nuclear weapons would most likely be used and the earth as we know it would cease to exist. Therefore, limited, no-win conflicts will continually persist, beginning with Korea, then Viet Nam, then Iraq, then Afghanistan. The irony is that we are wasting our human resources and our treasury and bankrupting our country while both Russia and China sit back, spend nearly nothing (the USSR in Afghanistan is the exception), and develop the kind of infrastructure, consumer goods, and education that we should be developing.

Finally, the record of the CIA leaves a lot to be desired. There were many failures despite billions of dollars spent and the infiltration of every branch of our society, from education to media to think tanks to the military. Read this book and you will also discover the misadventure in Viet Nam that cost 58,000-plus American casualties, millions of Vietnamese lives, and millions of servicemen who would never be the same after this debacle. Colonel Prouty explains this better than anyone else I have read. He predicted another debacle (Iraq and Afghanistan) after the Viet Nam debacle. I believe Colonel Prouty passed away last decade, but he would not have been shocked by the ridiculous misadventures in both of the aforementioned countries. Think of the trillions of dollars and the blood shed on a military misadventure that has no way of producing a positive outcome for the United States.


Jeff Marzano, December 17, 2014
An American Hero Reveals The Shocking Truth

This book provides a rare glimpse into the secret history and evil machinations of the CIA as it mutated from its original form between 1946 up until the time the book was published in 1973 when it had become a cancerous blight within the government.

It should not be surprising that most people never really understood the so-called Vietnam War, and they still don't. Even people in the American government, like the Secretary of Defense, were completely confused and manipulated by the Agency, as it's called.

President Kennedy was somewhat inexperienced when he first entered office. JFK thought he could handle problems in the government in the same way he handled problems during his presidential campaign. He had an informal style at first where he would just ask a friend to take care of it. This caused JFK to disregard important checks and balances which had been set up to hopefully prevent the CIA from crossing the line from being just an intelligence agency into the realm of initiating clandestine military operations.

The National Security Council was supposed to give direction to the CIA, and then the Operations Coordination Board was supposed to verify that the CIA had done what it was told, and only what it was told. But even before JFK got into office, the Agency had taken many determined steps to undermine those controls.

JFK's informal style opened the door even wider for the Agency to circumvent whatever controls may have still been effective to put some sort of limits on their 'fun and games'. Having an informal style with them was dangerous because they were experts at getting around all sorts of rules and laws.

The Agency double-crossed JFK during the Bay of Pigs debacle. Publicly JFK took the blame for what happened, but according to Fletcher it was the CIA that cancelled the air support that would have destroyed Fidel Castro's planes on the ground. As a result, JFK's only options were to accept the blame or admit to the world that things were being done by the American military establishment that he wasn't even aware of. John Kennedy was a fast learner, however, and he stated that he would break the CIA into a thousand tiny pieces. JFK was fed up with all of the Agency's fun and games.

Something similar happened with the Gary Powers U-2 spy plane that went down in the Soviet Union. The evil Secret Team sabotaged the U-2 to derail President Eisenhower's lifelong dream of holding a worldwide peace summit. Like JFK, Ike accepted the blame publicly.

Ike's only other option would have been to admit that the U-2 flight was unauthorized and then fire Allen Dulles and the other leaders of the evil Secret Team. But Fletcher says Ike couldn't do this for various reasons, even though Nikita Khrushchev probably realized that Eisenhower had not broken his word and authorized the U-2 mission.

Ike's comments about the Military Industrial Complex which he made during his farewell address turned out to be very prophetic indeed.

These examples provide the picture of an Agency that had become a law unto itself which reinterpreted whatever orders it was given to make those orders conform to their evil schemes. Fletcher provides many details in the book about how the Agency was able to circumvent laws and regulations and manipulate anyone and everyone in the government starting with the president. They did this mainly by abusing their control of secrecy but they used many other methods as well.

Secret Team leader Allen Dulles wrote a book called 'The Craft of Intelligence'. The title of this book itself hints at the very problem Fletcher Prouty explains in his own. Dulles viewed himself as a sort of artist or craftsman who could distort information and make it appear in any form he wanted. Strangely, Fletcher refers to his close personal friendship with Allen Dulles in the acknowledgements at the beginning of the book but then spends the rest of the book portraying Dulles as a sort of Joseph Goebbels figure.

Fletcher spends over 300 pages describing the metamorphosis which occurred with the CIA as it veered very far afield from what President Truman had intended when he created the Agency. Then, towards the end of the book, Fletcher finally reveals his shocking conclusions about what this massive abuse of power led to.

Fletcher felt that the assassination of President Kennedy was the single most pivotal event in modern American history in terms of the changes it caused.

Sadly, as Fletcher points out, the Vietnam War never really had any military objective. The theory was that if South Vietnam fell, this would cause a domino effect and the dreaded communism monster would start gobbling up the entire world. Then, when South Vietnam did fall with no domino effect, the Secret Team published a group of documents called the Pentagon Papers. These documents deflected blame away from the CIA and said nobody had listened to the CIA when it warned that the Vietnam situation was not winnable.

But it wouldn't matter if anyone listened to the Secret Team anyway because they always lie.

This book presents an American government in chaos during the Vietnam era. It was a government that had been hijacked by the evil Secret Team.

After the Bay of Pigs incident, Fidel Castro apparently got fed up with the CIA and America in general. Castro turned to the Soviet Union instead. This led to the Cuban Missile Crisis. It is only in the last 10 years or so that people have realized just how close the world came to an all-out nuclear exchange at that time.

This was a very dangerous game that master craftsman Allen Dulles and his fellow liars were playing. They were like kids starting fires all over a big field and then just sitting back to see which of those fires would become an inferno, as Vietnam did.

Also in recent years people have implicated Lyndon Johnson as being part of the conspiracy to assassinate JFK. So LBJ was on the team also.

I'm not sure if Fletcher ever really spells out what the true motivations of the Secret Team were but he hints at it. Probably the three main reasons that people engage in criminal activity are sex, money, and revenge. Usually when crimes are committed there's a money trail somewhere. And in the case of government military spending that's a very long trail.

This is a serious book which contains many details about an approximately 25 year period that began after World War II. It is not light reading.

Watch this documentary series on the internet. The hypocrites have pulled it off the market:

The Men Who Killed Kennedy

The Men Who Killed Kennedy DVD Series - Episode List

1. "The Coup D'Etat" (25 October 1988)
2. "The Forces Of Darkness" (25 October 1988)
3. "The Cover-Up" (20 November 1991)
4. "The Patsy" (21 November 1991)
5. "The Witnesses" (21 November 1991)
6. "The Truth Shall Set You Free" (1995)

The Final Chapter episodes (internet only):

7. "The Smoking Guns" (2003)
8. "The Love Affair" (2003)
9. "The Guilty Men" (2003)

Herman, February 4, 2017
Extensive analysis of the CIA from its inception to the 1970's

The fact that this book all but disappeared when it was distributed in the 1970's shows that the CIA did not want any of its "dirty laundry" aired in public. Prouty does an excellent (almost over-the-top) job of describing the rise, strategies, and evolution of the CIA up through the 70's. That the Vietnam War was still controlled by the CIA at the writing of the original book also shows JFK had not gained control of the military-industrial complex. For those who want to fill in more pieces of the puzzle, this is an excellent source from a man who found himself in the thick of things for many years. The one shortcoming comes in the last chapter, in his description of Nixon and especially LBJ not being able to control the military-industrial complex either.

Subsequent independent research over many years seems to show that LBJ, who was about to go to jail and be dropped from the 1964 ticket, knew about and helped cover up the JFK assassination, and is known to have remarked: "Just get me elected and you can have your damn war."

There is also evidence Nixon and company undermined the 1968 peace talks as LBJ was trying to end the war, and LBJ actually called Nixon and asked him to back off (kinda like the October 1980 surprise by Reagan). We also know from Judyth Vary Baker that Lee Oswald was the assassin of JFK and was in fact on the payroll of the FBI and CIA.

James E. Files has confessed to being one of the shooters, and E. Howard Hunt told his son he was involved, and he was CIA at the time. But no one man can possibly know everything. Given the pervasive infiltration of the government, the military, and probably many civil institutions by the CIA, one wonders who comprises the shadow government in reality.

Boyce Hart, July 22, 2010
The Critical Sinews btw CIA and other Gov. Agencies

What does it mean when we say "the CIA did such and such an action"? Just what is the CIA, a whole or a part? Given its emphasis on compartmentalization, is it accurate to say "the CIA was heavily involved in the JFK assassination," or would it be more accurate to say parts of the CIA were? Moreover, who is the CIA, and what are the powers behind it? Also, perhaps most importantly, what were the relations between the CIA and other parts of government, and how and when did these relationships change and evolve? Were these changes made democratically or secretly? These last two questions are the essence of this book. Yes, it is true, as one reviewer noted, that this book could have used an editor. Sometimes it has the feel of a collection of speeches, but not always. So why five stars instead of four? The subject matter -- in particular the last two questions above -- is just too rarely mentioned and discussed. This book really helps us understand the curiously evolving nervous system of the CIA between 1947 and 1963, as very, very few other books do. It sees the inception of the CIA in 1947 as just the first step, and makes it clear that later developments were neither willed nor pre-ordained by many of the elected officials who wrote the National Security Act of 1947.

The only other book that really addresses this BETWEEN WORLD -- i.e., the space between the CIA and other government agencies -- is one of the three most important books published in the last 50 years, IMO: Thy Will Be Done: Nelson Rockefeller, Evangelism, and the Conquest of the Amazon in the Age of Oil, by Colby and Dennett.

Still, there is one book I recommend even more than that one: JFK and the Unspeakable: Why He Died and Why It Matters, by James W. Douglass. It is not merely the current Gold Standard for all JFK research; it is far more than that: it is the Gold Standard for all US Cold War history research. This book is so important because it is not merely about who done it but about why. It is a book that mixes the how and why of JFK and those crucial-because-contestable Cold War years 1960-63 like no other.

Luc REYNAERT, November 30, 2008
A symbol of sinister and mysterious foreign intrigue (H. Truman)

This is an extremely important book. The proof is that even the official copy in the Library of Congress disappeared (!). Moreover, even after his death, the author continues to be the object of a smear campaign (see the internet).

His book is nothing less than a frontal attack on US intelligence and, concomitantly, on those who control it.
Its portrait of Allen Dulles, a longtime intelligence director, says it all: `I am a lawyer'; in other words, a servant. But of whom?
This book unveils the existence of a secret cabal, a Power Elite (G. William Domhoff), a `deep State' (P. D. Scott), within the US and its government, as well as in about 40 host countries.
This Power Elite uses the Secret Team of top intelligence and military commanders as its long arm, and protects it. Together they stand above the law and the democratic process. They get things done, whether they have the political authorization or not.
They have at their disposal a vast undercover political, military, intelligence, business, media, and academic infrastructure, in the US as well as worldwide. They don't respect the nation-state and are able to create, influence, and topple governments in the hemisphere they control.

The author gives a remarkable insight into the inner workings, logistics, strategies, and tactics of the intelligence agency. Its creation and history show that President H. Truman never intended to create an autonomous operational agency in the clandestine field. L. F. Prouty also gives valuable information about the U-2/G. Powers incident (apparently staged to torpedo the US/USSR peace talks) and the Pentagon Papers (an intelligence whitewash).

At the end, the author poses the all important question: `Can any President ever be strong enough really to rule?'

This book is a must read for all those interested in US history and for all those who want to understand the world we live in.

For more information on the Power Elite, I recommend the works of O. Tunander, D. Estulin, Peter Dale Scott, Carroll Quigley, Gary Allen and G. W. Domhoff.

anarchteacher, April 30, 2008
An Insider's Candid Exposé of the National Security State

As in the case of the brilliant Jules Archer volume, The Plot To Seize The White House, it is terrific to have this masterful study of the inner workings of the early CIA back in print after so many years of unavailability.

Skyhorse Publishing is to be commended for seeing to it that both of these crucial works are again available to the attentive reading public, who want to know the truth concerning the dark hidden history that the government has so actively strived to keep buried.

The late Colonel L. Fletcher Prouty served as chief of special operations for the Joint Chiefs of Staff where he was in charge of the global system designed to provide military support for covert activities of the Central Intelligence Agency.

In Oliver Stone's highly acclaimed film on the assassination of President John Fitzgerald Kennedy, JFK, the mysterious character "X," portrayed by Donald Sutherland, was in fact Colonel Prouty, who assisted director Stone in the production and scripting of this historical epic. Prouty had relayed the shocking information detailed in the movie to the actual New Orleans District Attorney Jim Garrison, played by Kevin Costner, in a series of communiques.

The Secret Team was first published in 1973 during the Watergate scandal, when many Americans were first learning about the dark side of covert government, an outlaw executive branch headed by a renegade chief of state. Richard Nixon would not be the last of this foul breed.

This was years before Frank Church's Senate Committee's damning revelations of CIA misdeeds and assassination plots against foreign leaders rocked the nation.

In each chapter of his book, Prouty speaks frankly, with an insider's knowledge, about what he describes as the inner workings of "the Secret Team."

This prudential judgment and keen assessment of the National Security Establishment were gained from years as a seasoned behind-the-scenes professional in military intelligence, working intimately with those of the highest rank in policy making and implementation.

The important story Prouty boldly tells should be read by every reflective American.

SER, December 6, 2001
Best Book On CIA Misdeeds

The author was the liaison officer between the CIA and the military during the 50's and 60's. As an Air Force officer (Colonel), he was exempt from taking the CIA oath of secrecy and therefore was in a position to write this book in 1973. Apparently, shortly after the book's publication, almost all copies disappeared, probably bought up by the CIA. I was lucky to find a copy, published in Taiwan (Imperial Books & Records), in a used bookstore several years ago. The author details not only how the CIA conducts its operations but, more importantly, how it manages to keep most or all of its deeds from the eyes of Congress, the population, and even the President, if necessary. This is the best book I've read on the secret workings of the CIA and its misdeeds during the 50's and early 60's. Not to belittle them, but The Secret Team is a far more informative book than Marchetti and Marks' The CIA And The Cult Of Intelligence.

added, Jan09:

Actually, practically ever since I posted the review, I've been wanting to write a more detailed one, but since it's now been some 20 years since I read the book, I can't remember enough details to do it justice. If I ever reread it, I'll be sure to post a better review. I frankly think my present "review" isn't much of one - and it was cut short after my reference to the Marchetti/Marks book, the linking to which was not allowed at the time.

For example, one item of considerable current interest which I remember from the book is the author's detailing of Operation Northwoods, from the early 1960's - the plan by the intelligence agencies to conduct a false flag attack against American interests and blame it on Cuba, in order to justify a war against that country.
There was a big deal made about this (deservedly, in my opinion), only four or five years ago, when the National Security Archive (an apparently independent non-governmental research institute at George Washington University) discovered the details of this proposed operation, supposedly for the first time, in declassified documents. (This was in light of the ongoing conspiratorial controversies surrounding the 9-11 events.)
Yet, author Prouty detailed Operation Northwoods in his The Secret Team, first published long ago in 1973.
This is but one detail that shows why this book deserves the much more elaborate review it needs.

I'd like to also add (since it is now apparently allowed) that The Secret Team, among other items, is available on CD from the L. Fletcher Prouty Reference Site: http://www.prouty.org/

Finally, for readers still obsessed with the JFK assassination, I would like to recommend Final Judgment - The Missing Link in the JFK Assassination Conspiracy, by Michael Collins Piper, a book which lives up to its title. My use of the word "obsessed" is not meant derogatorily, as I have my own bookshelf-full as testament to that particular subject, but as an inducement to read the book, which will make the big picture very clear indeed. Do yourselves the favor.

Last edit: Jan09

Michael Tozer, July 7, 2006
Great!

Colonel Prouty's book on the Secret Team should be required reading for all concerned Americans. Herein, the author, a retired Air Force Colonel and CIA insider, reveals for all to see the machinations of the Secret Team and their impact on US history in the post World War II era. This is terribly important information.

I was particularly impressed with Prouty's depiction of Eisenhower's peace initiative and how it was sabotaged by the Secret Team. Ike was preparing for his peace summit with Khrushchev when Gary Powers was sent off on his fool's errand on April 30th, 1960, a date with significant occult emblematics. The capture of Powers by the Soviets effectively scuttled the Eisenhower peace plan, which would have ruined the plans of the Secret Team for continued Cold War tension, and treasure for the merchants of venom.

The essential truths in this important book are still relevant today. Of course, the ineffectual George Walker Bush is not entirely in charge of American foreign policy in this critical time. He is certainly still being manipulated by the successors of the Secret Team depicted in this excellent and well-written book. Any serious student of American foreign policy in the post-World War II era ought to read this important book.

D. E. Tench, May 24, 2013
Conspiracy History - not Theory!

The Colonel's book contains valuable and legitimate insider information about how factions within our government have been dishonest, selfish, and ruthlessly brutal for a very long time now. He shows the reader more than one vivid picture of how our American Presidents are routinely hoodwinked and manipulated by CIA moles - covert operators who often work within the D.C. Beltway.

I only wish he had expanded on the following statement (from page 15 of the 1973 edition): "There were and are many men who are not in government who are prime movers of Secret Team activity." Perhaps he knew enough to mention their connection to and influence over the Agency, but not enough to elaborate upon it. Or perhaps he knew better than to push that topic too far if he wanted to get published. In 1973 there were no on-demand self-publishing formats like what we have available to us today.

Prouty also mentions the non-governmental elements of secrecy in Chapter 23, but it's closer to a defining of terms than an elaboration. He ends the book with a view of the Secret Team as an evolved and faceless mechanism serving the cause of anti-Communism. Today, the cause du jour is anti-terrorism. However, I argue that secret teams are never faceless, but are made up of individuals.

The Secret Team that Col. Prouty revealed was part of a larger Secret Team. My book: "Know Your Enemy: Exposing Satan's Human Army" discusses the spiritual state of secretive operators and some of what scripture reveals on the topic.

[Apr 04, 2019] Fascism: A Warning by Madeleine Albright

Junk author, junk book from the butcher of Yugoslavia, who, along with Bill Clinton, would have been hanged by a Nuremberg Tribunal for crimes against peace. Albright is not bright at all; she is a bully, and it shows.
Mostly projection. And this arrogant warmonger likes to exercise her Russophobia against Russia, the main part of the USSR, which saved the world from fascism at the cost of around 20 million people. This book is a book of denial of the genocide against the Iraqi and Serbian populations, where bombing with depleted-uranium munitions doubled cancer rates. If you can pass over those facts, then this book is for you.
Like Robert Kagan and other neocons, Albright is waving the dead chicken of authoritarianism again and again. That's silly and disingenuous. Authoritarianism is a method of governance, used, for example, in the military. It is not an ideology. Fascism is an ideology: a flavor of far-right nationalism, "enhanced" with some socialist ideas.
A view of fascism that ignores the economic circumstances that create it, first of all the immiseration of the middle and working classes and a high level of unemployment, is a primitive, ahistorical view. Fascism is the ultimate capitalist statism, acting simultaneously as a civil religion for the population, enforced by the power of the state. It has a lot in common with neoliberalism, which is why neoliberalism is sometimes called "inverted totalitarianism."
In reality, fascism, while remaining a dictatorship of capitalists, for capitalists, and for the national part of the financial oligarchy, is, like neoliberalism, directed against the working class. Fascism comes to power on populist slogans of righting the wrongs of the previous regime and of kicking out foreign capitalists and national compradors (who in Germany turned out to be mostly Jewish).
It comes to power under the slogans of stopping the redistribution of wealth upward and of eliminating the class of rentiers -- all citizens should earn income, not get it from bonds and other investments (while in reality often doing completely the opposite).
While intrinsically connected to and financed by a sizable part of the national elite, which often consists of far-right military leadership and a part of the financial oligarchy, along with a large part of the lower middle class (small proprietors), it is a protest movement that wants revenge for humiliation and prefers a military-style organization of society to democracy as a more potent weapon for achieving this goal.
Like that of any far-right movement, the rise of fascism and neo-fascism is a sign of internal problems within a given society, and often a threat to the state or the social order.
Apr 04, 2019 | www.amazon.com

Still another noted that Fascism is often linked to people who are part of a distinct ethnic or racial group, who are under economic stress, and who feel that they are being denied rewards to which they are entitled. "It's not so much what people have," she said, "but what they think they should have -- and what they fear." Fear is why Fascism's emotional reach can extend to all levels of society. No political movement can flourish without popular support, but Fascism is as dependent on the wealthy and powerful as it is on the man or woman in the street -- on those who have much to lose and those who have nothing at all.

This insight made us think that Fascism should perhaps be viewed less as a political ideology than as a means for seizing and holding power. For example, Italy in the 1920s included self-described Fascists of the left (who advocated a dictatorship of the dispossessed), of the right (who argued for an authoritarian corporatist state), and of the center (who sought a return to absolute monarchy). The German National Socialist Party (the Nazis) originally came together around a list of demands that catered to anti-Semites, anti-immigrants, and anti-capitalists but also advocated for higher old-age pensions, more educational opportunities for the poor, an end to child labor, and improved maternal health care. The Nazis were racists and, in their own minds, reformers at the same time.

If Fascism concerns itself less with specific policies than with finding a pathway to power, what about the tactics of leadership? My students remarked that the Fascist chiefs we remember best were charismatic. Through one method or another, each established an emotional link to the crowd and, like the central figure in a cult, brought deep and often ugly feelings to the surface. This is how the tentacles of Fascism spread inside a democracy. Unlike a monarchy or a military dictatorship imposed on society from above, Fascism draws energy from men and women who are upset because of a lost war, a lost job, a memory of humiliation, or a sense that their country is in steep decline. The more painful the grounds for resentment, the easier it is for a Fascist leader to gain followers by dangling the prospect of renewal or by vowing to take back what has been stolen.

Like the mobilizers of more benign movements, these secular evangelists exploit the near-universal human desire to be part of a meaningful quest. The more gifted among them have an aptitude for spectacle -- for orchestrating mass gatherings complete with martial music, incendiary rhetoric, loud cheers, and arm-lifting salutes. To loyalists, they offer the prize of membership in a club from which others, often the objects of ridicule, are kept out. To build fervor, Fascists tend to be aggressive, militaristic, and -- when circumstances allow -- expansionist. To secure the future, they turn schools into seminaries for true believers, striving to produce "new men" and "new women" who will obey without question or pause. And, as one of my students observed, "a Fascist who launches his career by being voted into office will have a claim to legitimacy that others do not."

After climbing into a position of power, what comes next: How does a Fascist consolidate authority? Here several students piped up: "By controlling information." Added another, "And that's one reason we have so much cause to worry today." Most of us have thought of the technological revolution primarily as a means for people from different walks of life to connect with one another, trade ideas, and develop a keener understanding of why men and women act as they do -- in other words, to sharpen our perceptions of truth. That's still the case, but now we are not so sure. There is a troubling "Big Brother" angle because of the mountain of personal data being uploaded into social media. If an advertiser can use that information to home in on a consumer because of his or her individual interests, what's to stop a Fascist government from doing the same? "Suppose I go to a demonstration like the Women's March," said a student, "and post a photo on social media. My name gets added to a list and that list can end up anywhere. How do we protect ourselves against that?"

Even more disturbing is the ability shown by rogue regimes and their agents to spread lies on phony websites and Facebook. Further, technology has made it possible for extremist organizations to construct echo chambers of support for conspiracy theories, false narratives, and ignorant views on religion and race. This is the first rule of deception: repeated often enough, almost any statement, story, or smear can start to sound plausible. The Internet should be an ally of freedom and a gateway to knowledge; in some cases, it is neither.

Historian Robert Paxton begins one of his books by asserting: "Fascism was the major political innovation of the twentieth century, and the source of much of its pain." Over the years, he and other scholars have developed lists of the many moving parts that Fascism entails. Toward the end of our discussion, my class sought to articulate a comparable list.

Fascism, most of the students agreed, is an extreme form of authoritarian rule. Citizens are required to do exactly what leaders say they must do, nothing more, nothing less. The doctrine is linked to rabid nationalism. It also turns the traditional social contract upside down. Instead of citizens giving power to the state in exchange for the protection of their rights, power begins with the leader, and the people have no rights. Under Fascism, the mission of citizens is to serve; the government's job is to rule.

When one talks about this subject, confusion often arises about the difference between Fascism and such related concepts as totalitarianism, dictatorship, despotism, tyranny, autocracy, and so on. As an academic, I might be tempted to wander into that thicket, but as a former diplomat, I am primarily concerned with actions, not labels. To my mind, a Fascist is someone who identifies strongly with and claims to speak for a whole nation or group, is unconcerned with the rights of others, and is willing to use whatever means are necessary -- including violence -- to achieve his or her goals. In that conception, a Fascist will likely be a tyrant, but a tyrant need not be a Fascist.

Often the difference can be seen in who is trusted with the guns. In seventeenth-century Europe, when Catholic aristocrats did battle with Protestant aristocrats, they fought over scripture but agreed not to distribute weapons to their peasants, thinking it safer to wage war with mercenary armies. Modern dictators also tend to be wary of their citizens, which is why they create royal guards and other elite security units to ensure their personal safety. A Fascist, however, expects the crowd to have his back. Where kings try to settle people down, Fascists stir them up so that when the fighting begins, their foot soldiers have the will and the firepower to strike first.


petarsimic , October 21, 2018

Madeleine Albright on million Iraqis dead: "We think the price is worth It"

Hypocrisy at its worst from a lady who advocated hawkish foreign policy which included the most sustained bombing campaign since Vietnam, when, in 1998, Clinton began almost daily attacks on Iraq in the so-called no-fly zones, and made so-called regime change in Iraq official U.S. policy.

In May of 1996, 60 Minutes aired an interview with Madeleine Albright, who at the time was Clinton's U.N. ambassador. Correspondent Leslie Stahl said to Albright, in connection with the Clinton administration presiding over the most devastating regime of sanctions in history, which the U.N. estimated took the lives of as many as a million Iraqis, the vast majority of them children: "We have heard that a half-million children have died. I mean, that's more children than died in Hiroshima. And -- and, you know, is the price worth it?"

Madeleine Albright replied, "I think this is a very hard choice, but the price -- we think the price is worth it."

P. Bierre, June 11, 2018
Does Albright present a comprehensive enough understanding of fascism to instruct on how best to avoid it?

While I found much of the story-telling in "Fascism" engaging, I came away expecting much more from one of our nation's pre-eminent senior diplomats. In a nutshell, she has devoted a whole volume to describing the ascent of intolerant fascism and its many faces, but punted on the question "How should we thwart fascism going forward?"

Even that question leaves me a bit unsatisfied, since it is couched in double-negative syntax. The thing there is an appetite for, among the readers of this book who are looking for more than hand-wringing about neofascism, is a unifying title or phrase which captures in single-positive syntax that which Albright prefers over fascism. What would that be? And, how do we pursue it, nurture it, spread it and secure it going forward? What is it?

I think Albright would perhaps be willing to rally around "Good Government" as the theme her book skirts tangentially from the dark periphery of fascistic government. "Virtuous Government"? "Effective Government"? "Responsive Government"?

People concerned about neofascism want to know what we should be doing right now to avoid getting sidetracked into a dark alley of future history comparable to the Nazi brown shirt or Mussolini black shirt epochs. Does Albright present a comprehensive enough understanding of fascism to instruct on how best to avoid it? Or, is this just another hand-wringing exercise, a la "you'll know it when you see it", with a proactive superficiality stuck at the level of pejorative labelling of current styles of government and national leaders? If all you can say is what you don't want, then the challenge of threading the political future of the US is left unruddered. To make an analogy to driving a car, if you don't know your destination, and only can get navigational prompts such as "don't turn here" or "don't go down that street", then what are the chances of arriving at a purposive destination?

The other part of this book I find off-putting is that Albright, though having served as Secretary of State, never talks about the heavy burden of responsibility that falls on a head of state. She doesn't seem to empathize at all with the challenge of top leadership. Her perspective is that of the detached critic. For instance, in discussing President Duterte of the Philippines, she fails to paint the dire situation under which he rose to national leadership responsibility: Islamic separatists having violently taken over the entire city of Marawi, and the ubiquitous spread of drug cartel power to the level where control over law enforcement was already ceded to the gangs in many places...entire islands and city neighborhoods run by mafia organizations. It's easy to sit back and criticize Duterte's unleashing of vigilante justice -- what was Mrs. Albright's better alternative to regain ground from vicious, well-armed criminal organizations? The distancing from leadership responsibility makes Albright's treatment of the Philippines' twin crises of gang rule and Islamist revolutionaries seem like so much academic navel-gazing...OK for an undergrad course at Georgetown maybe, but unworthy of someone who served in a position of high responsibility. Duterte is liked in the Philippines. What he did snapped back the power of the cartels, and returned a deserved sense of security to average Filipinos (at least those not involved with narcotics). Is that not good government, given the horrendous circumstances Duterte came up to deal with? What lack of responsibility in former Philippine leadership allowed things to get so out of control? Is it possible that Democrats and liberals are afraid to be tough, when toughness is what is needed? I'd much rather read an account from an average Filipino about the positive impacts of the vigilante campaign than listen to Madame Secretary sermonizing out of context about Duterte. OK, he's not your idea of a nice guy.
Would you rather sit back and prattle on about the rule of law and due process while Islamic terrorists wrest control over where you live? Would you prefer the leadership of a drug cartel boss to Duterte?

My critique is offered in a constructive manner. I would certainly encourage Albright (or anyone!) to write a book in a positive voice about what it's going to take to have good national government in the US going forward, and to help spread such abundance globally. I would define "good" as the capability to make consistently good policy decisions, ones that continue to look good in hindsight, 10, 20 or 30 years later. What does that take?

I would submit that the essential "preserving democracy" process component is having a population that is adequately prepared for collaborative problem-solving. Some understanding of history is helpful, but it's simply not enough. Much more essential is for every young person to experience team problem-solving, in both its cooperative and competitive aspects. Every young person needs to experience a team leadership role, and to appreciate what it takes from leaders to forge constructive design from competing ideas and champions. Only after serving as a referee will a young person understand the limits to "passion" that individual contributors should bring to the party. Only after moderating and herding cats will a young person know how to interact productively with leaders and other contributors. Much of the skill is counter-instinctual. It's knowing how to express ideas...how to field criticism....how to nudge people along in the desired direction...and how to avoid ad-hominem attacks, exaggerations, accusations and speculative grievances. It's learning how to manage conflict productively toward excellence. Way too few of our young people are learning these skills, and way too few of our journalists know how to play a constructive role in managing communications toward successful complex problem-solving. Albright's claim that a journalist's job is primarily to "hold leaders accountable" really betrays an absolving of responsibility for the media as a partner in good government -- it doesn't say whether the media are active players on the problem-solving team (which they have to be for success), or mere spectators with no responsibility for the outcome. If the latter, then journalism becomes an irritant, picking at the scabs over and over, but without any forward progress. When the media takes up a stance as an "opponent" of leadership, you end up with poor problem-solving results....the system is fighting itself instead of making forward progress.

"Fascism" doesn't do nearly enough to promote the teaching of practical civics 101 skills, not just to the kids going into public administration, but to everyone. For, it is in the norms of civility, their ability to be practiced, and their defense against excesses, that fascism (e.g., Antifa) is kept at bay.
Everyone in a democracy has to know the basics:
• when entering a disagreement, don't personalize it
• never demonize an opponent
• keep a focus on the goal of agreement and moving forward
• never tell another person what they think, but ask (non-rhetorically) what they think, then be prepared to listen and absorb
• do not speak untruths or exaggerate to make an argument
• do not speculate grievance
• understand truth gathering as a process; detect when certainty is being bluffed; question sources
• recognize impasse and unproductive argumentation and STOP IT
• know how to introduce a referee or moderator to regain productive collaboration
• avoid ad hominem attacks
• don't take things personally that rankle you
• give the benefit of the doubt in an ambiguous situation
• don't jump to conclusions
• don't reward theatrical manipulation

These basics of collaborative problem-solving are the guts of a "liberal democracy" that can face down the most complex challenges and dilemmas.

I gave the book 3 stars for the great story-telling, and Albright has been part of a great story of late 20th century history. If she had told us how to prevent fascism going forward, and how to roll it back in "hard case" countries like North Korea and Sudan, I would have given her a 5. I'm not that interested in picking apart the failure cases of history...they teach mostly negative exemplars. I would much rather read about positive exemplars of great national government -- "great" as defined by popular acclaim, by those actually governed. Where are we seeing that today? Canada? Australia? Interestingly, both of these positive exemplars have strict immigration policies.

Is it possible that Albright is just unable, by virtue of her narrow escape from Communist Czechoslovakia and acceptance in NYC as a transplant, to see that an optimum immigration policy in the US, something like Canada's or Australia's, is not the looming face of fascism, but rather a move to keep it safely in its corner in coming decades? At least, she admits to her being biased by her life story.

That suggests her views on refugees and illegal immigrants as deserving of unlimited rights to migrate into the US might be the kind of cloaked extremism that she is warning us about.

Anat Hadad , January 19, 2019
"Fascism is not an exception to humanity, but part of it."

Albright's book is a comprehensive look at recent history regarding the rise and fall of fascist leaders; as well as detailing leaders in nations that are starting to mimic fascist ideals. Instead of a neat definition, she uses examples to bolster her thesis of what are essential aspects of fascism. Albright dedicates each section of the book to a leader or regime that enforces fascist values and conveys this to the reader through historical events and exposition while also peppering in details of her time as Secretary of State. The climax (and 'warning'), comes at the end, where Albright applies what she has been discussing to the current state of affairs in the US and abroad.

Overall, I would characterize this as an enjoyable and relatively easy read. I think the biggest strength of this book is how Albright uses history, previous examples of leaders and regimes, to demonstrate what fascism looks like and the contributing factors on a national and individual level. I appreciated that she lets these examples speak for themselves about the dangers and subtleties of a fascist society, which made the book more fascinating and less of a textbook. Her brief descriptions of her time as Secretary of State were intriguing and made me more interested in her first book, 'Madame Secretary'. The book does seem a bit slow, as it is not until the end that Albright blatantly reveals the relevance of all the history relayed in the first couple hundred pages. The last few chapters are dedicated to the reveal: the Trump administration and how it has affected global politics. Although she never outright calls Trump a fascist, instead letting the reader decide based on his decisions and what they have read in the book leading up to this point, her stance is quite clear by the end. I was surprised at what I shared politically with Albright, mainly on immigration and a belief in empathy and understanding for others. However, I got a slight sense of anti-secularism in the form of a disdain for those who do not subscribe to an Abrahamic religion, and she seemed to hint at this being partly an opening to fascism.

I also could have done without the both-sides-ism she would occasionally push, which seems to be a tactic used to encourage people to 'unite against Trump'. These are small annoyances I had with the book, my main critique is the view Albright takes on democracy. If anything, the book should have been called "Democracy: the Answer" because that is the most consistent stance Albright takes throughout. She seems to overlook many of the atrocities the US and other nations have committed in the name of democracy and the negative consequences of capitalism, instead, justifying negative actions with the excuse of 'it is for democracy and everyone wants that' and criticizing those who criticize capitalism.

She does not do a good job of conveying the difference between a communist country like Russia and a socialist country like those found in Scandinavia and seems okay with the idea of the reader lumping them all together in a poor light. That being said, I would still recommend this book for anyone's TBR as the message is essential for today, that the current world of political affairs is, at least somewhat, teetering on a precipice and we are in need of as many strong leaders as possible who are willing to uphold democratic ideals on the world stage and mindful constituents who will vote them in.

Matthew T , May 29, 2018
An easy read, but incredibly ignorant and one-eyed in far too many instances

The book is very well written, easy to read, and follows a pretty standard formula making it accessible to the average reader. However, it suffers immensely from, what I suspect are, deeply ingrained political biases from the author.

Whilst I don't dispute the criteria the author applies in defining fascism, or the targets she cites as examples, the first bias creeps in here when one realises the examples chosen are traditional easy targets for the US (with the exception of Turkey). The same criteria would define a country like Singapore perfectly as fascist, yet the country (or Malaysia) does not receive a mention in the book.

Further, it grossly glosses over what Ms. Albright terms fascist traits from US governments of the past. If the author is to be believed, the CIA is holier than thou, never intervened anywhere or did anything that wasn't with the best interests of democracy at heart, and American foreign policy has always existed to build friendships and help out their buddies. To someone ingrained in this rhetoric for years I am sure this is an easy pill to swallow, but to the rest of the world it makes a number of assertions in the book come across as incredibly naive.

Avid reader , December 20, 2018
Biased much? Still a good start on the problem

We went with my husband to the presentation of this book at UPenn with Albright before it came out, and Madeleine's spunk, wit and just glorious brightness almost blinded me. This is a 2.5-star book, because the 81-year-old author does not really tell you all there is to tell when she opens up on a subject in any particular chapter, especially if it concerns current US interests.

Let's start from the beginning of the book. What really stood out: the missing third Axis ally, Japan and its emperor. Hirohito (1901-1989) was emperor of Japan from 1926 until his death in 1989. He took over at a time of rising democratic sentiment, but his country soon turned toward ultra-nationalism and militarism. During World War II (1939-45), Japan attacked nearly all of its Asian neighbors, allied itself with Nazi Germany and launched a surprise assault on the U.S. naval base at Pearl Harbor, forcing the US to enter the war in 1941. Hirohito was never indicted as a war criminal! Does he not deserve at least a chapter in her book?

Oh, and by the way, did the author mention anything about sanctions against Germany for invading Austria, Czechoslovakia, Romania and Poland? Up until Pearl Harbor the USA and Germany still traded, although in March 1939 FDR slapped a 25% tariff on all German goods. Like Trump is doing right now to some US trading partners.

The next monster that deserves a chapter, on genocide of cosmic proportions post-WW2, is the communist leader of China, Mao Zedong. Mr. Dikötter, who has been studying Chinese rural history from 1958 to 1962, when the nation was facing a famine, compared the systematic torture, brutality, starvation and killing of Chinese peasants to the Second World War in its magnitude. At least 45 million people were worked, starved or beaten to death in China over these four years; the total worldwide death toll of the Second World War was 55 million.

We learn that Argentina gave sanctuary to Nazi war criminals, but she forgets to mention that 88 Nazi scientists arrived in the United States in 1945 and were promptly put to work. For example, Wernher von Braun was the brains behind the V-2 rocket program, but had intimate knowledge of what was going on in the concentration camps. Von Braun himself hand-picked people from horrific places, including the Buchenwald concentration camp. Tsk-tsk, Madeleine.

What else? Oh, let's just say that like Madeleine Albright my husband is Jewish and lost extensive family to the Holocaust. Ukrainian nationalists executed his great-grandfather on Gestapo orders; his great-grandmother disappeared in a concentration camp; his grandfather was conscripted in June 1940, decommissioned in September 1945, and went through the war as an infantryman on three fronts, earning several medals. His grandmother, a Ukrainian-born Jew, was a doctor in a military hospital in Saint Petersburg who survived the famine and saved several children during the blockade. So unlike Madeleine, who was raised as a Roman Catholic, my husband grew up in a quiet Jewish family in the territory that Stalin grabbed from Poland in 1939, in the Polish-turned-Ukrainian city of Lvov (Lemberg). His family also had to ask for asylum, only they had to escape their home in Ukraine in 1991. He was told then: "You are a nice little Zid (Jew), we will kill you last." If you think things in Ukraine have changed, think again: a few weeks ago in Kiev, Roma gypsies were killed and injured during pogroms, and despite witnesses nobody went to jail. Also, during demonstrations the C14 unit openly waves swastikas and gives Heils on the streets. Why is this not mentioned anywhere in the book? Is it because Hunter Biden has sat on the board of one of Ukraine's largest natural gas companies, Burisma, since May 14, 2014, and Ukraine has an estimated 127.9 trillion cubic feet of unproved technically recoverable shale gas resources, according to the U.S. Energy Information Administration (EIA)? The most promising shale reserves appear to be in the Carpathian Foreland Basin (also called the Lviv-Volyn Basin), which extends across Western Ukraine from Poland into Romania, and the Dnieper-Donets Basin in the East (which borders Russia).
Wow, I bet you did not know that. How ugly politics are; even this book could have been so much greater if the author had told the whole ugly story. And how scary that there are countries where you can go and openly be a fascist.

NJ, February 3, 2019
Interesting...yes. Useful...hmmm

To me, Fascism fails for the single reason that no two fascist leaders are alike. Learning about one or a few, in a highly cursory fashion like in this book or in great detail, is unlikely to provide one with any answers on how to prevent the rise of another or fend against some such. And, as much as we are witnessing the rise of numerous democratic or quasi-democratic "strongmen" around the world in global politics, it is difficult to brand any of them as fascist in the orthodox sense.

As the author writes at the outset, it is difficult to separate a fascist from a tyrant or a dictator. A fascist is a majoritarian who rouses a large group under some national, racial or similar flag with rallying cries demanding suppression or expulsion of those excluded from this group. A typical fascist leader loves her yes-men and hates those who disagree: she does not mind using violence to suppress dissidents. A fascist has no qualms using propaganda to popularize the agreeable "facts" and theories while debunking the inconvenient as lies. What is not discussed explicitly in the book are perhaps some positive traits that separate fascists from other types of tyrants: fascists are rarely lazy, stupid or prone to doing things only for personal gain. They differ from benevolent dictators in their record of using heavy oppression against their dissidents. Fascists, like all dictators, change rules to suit themselves, take control of state organizations to exercise total control and use "our class is the greatest" and "kick others" to fuel their programs.

Despite such a detailed list, each fascist is different from the others. There is little that even Ms Albright's fascists -- from Mussolini and Hitler to Stalin to the Kims to Chavez or Erdogan -- have in common. In fact, most of the opponents of some of these dictators/leaders would call them by many other choice words, but not fascists. The circumstances that gave rise to these leaders were highly different, and so were their rules, methods and achievements.

The point, once again, is that none of the strongman leaders around the world can be easily categorized as fascists. And even if they could, assigning them such a tag and learning about some other such leaders is unlikely to help. The history discussed in the book is interesting but disjointed, perfunctory and simplistic. Ms Albright's selection is also debatable.

Strong leaders who suppress those they deem as opponents have wreaked immense harm and are a threat to all civil societies. They come in more shades and colours than the terms we have in our vocabulary (dictators, tyrants, fascists, despots, autocrats etc.). A study of such tyrants is needed for anyone with an interest in history, politics, or societal well-being. Despite Ms Albright's phenomenal knowledge, experience, credentials, personal history and intentions, this book is perhaps not the best place to objectively learn much about the risks from the type of things some current leaders are doing or deeming as right.

Gderf , February 15, 2019
Wrong warning

Each time I get concerned about Trump's rhetoric or past actions I read idiotic opinions, like those of our second worst ever Secretary of State, and come to appreciate him more. Pejorative terms like fascism or populism have no place in a rational policy discussion. Both are blatant attempts to apply a pejorative to any disagreeing opinion. More than half of the book is fluffed with background of Albright, Hitler and Mussolini. Wikipedia is more informative. The rest has snippets of more modern dictators, many of whom are either socialists or attained power through a reaction to failed socialism, as did Hitler. She squirms mightily to liken Trump to Hitler. It's much easier to see that Sanders is like Maduro. The USA is following a path more like Venezuela than Germany.

Her history misses that Mussolini was a socialist before he was a fascist, and that Nazism in Germany was a reaction to Weimar socialism. The danger of fascism in the US is far greater from the left than from the right. America is far left of where the USSR ever was. Remember that Marx observed that Russia was not ready for a proletarian revolution. The USA, with ready-made capitalism to reform, fits Marx's pattern much better. Progressives deny that Sanders and Warren are socialists. If not, they are what Lenin called "useful idiots."
Albright says that she is proud of the speech where she called the USA the 'Indispensable Nation.' She should be ashamed. Obama followed in his inaugural address, saying that we are "the indispensable nation, responsible for world security." That turned into a policy of human rights interventions leading to open ended wars (Syria, Yemen), nations in chaos (Libya), and distrust of the USA (Egypt, Russia, Turkey, Tunisia, Israel, NK). Trump now has to make nice with dictators to allay their fears that we are out to replace them.
She admires the good intentions of human rights intervention while ignoring the results. She says Obama's foreign policy had some success, without citing a single instance. He has apologized for Libya, but needs many more apologies. Like many progressives, she confuses good intentions with performance. Democracy-spreading by well-intentioned humanitarian intervention has resulted in a succession of open-ended wars and anarchy.

The shorter histories of Czechoslovakia, Yugoslavia and Venezuela are much more informative, although more a warning against socialism than right wing fascism. Viktor Orban in Hungary is another reaction to socialism.

Albright ends the book with the forlorn hope that we need a Lincoln or a Mandela, exactly what our two-party dictatorship will not generate as it yields ever worse candidates for our democracy to vote upon, even as our great-society utopia grants ever more power to weak presidents to spend our money and continue wrong-headed foreign policy.

The greatest danger to the USA is not fascism, but excessively poor leadership continuing our slow slide to the bottom.

[Apr 02, 2019] Mr Cohen and I Live on Different Planets

Apr 02, 2019 | www.amazon.com

Looks like the reviewer is a typical neocon, recycling think tank talking points that reflect the "Full Spectrum Dominance" agenda.

As for Ukraine: yes, of course, Victoria Nuland did not interfere in the events, did not push for deposing Yanukovich to spoil the agreement reached between him and the EU diplomats ("F**k EU," as this high-level US diplomat eloquently expressed herself), and did not work to appoint the US stooge Yatsenyuk. The transcript of Nuland's phone call actually introduced many Americans to the previously obscure Yatsenyuk.

And the large amount of cash confiscated in the Kiev office of Yulia Tymoshenko's Batkivshchyna party (the main opposition party, at the time run by Yatsenyuk, as Tymoshenko was in jail) was just a hallucination. It has nothing to do with "bombing with dollars" -- yet another typical color revolution trick.

BTW, "government snipers on rooftops" is also a standard false flag operation used to incite an uprising at the critical moment of a color revolution. Ukraine was not the first and is not the last; one participant recently confessed. The key person in this false flag operation was the opposition leader Andriy Parubiy, who was responsible for the security of the opposition camp. Google "Parubiy and snipergate" for more information.

His view of the DNC hack (which most probably was a leak) also does not withstand close scrutiny. William Binney, a former high-level National Security Agency official, co-authored an analysis by a group of former intelligence professionals arguing that this was a transfer to a local USB drive, as the download speed was too high for an Internet connection. In this light the death of Seth Rich looks very suspicious indeed.

As for Russiagate, he now needs to print his review and a portrait of the Grand Wizard of Russiagate, Rachel Maddow, shred both of them, and eat them with borscht ;-)

[Apr 01, 2019] War with Russia: From Putin & Ukraine to Trump & Russiagate, by Stephen F. Cohen

Highly recommended!
Important book. Kindle sample
Notable quotes:
"... Washington has made many policies strongly influenced by the demonizing of Putin -- a personal vilification far exceeding any ever applied to Soviet Russia's latter-day Communist leaders. ..."
"... As with all institutions, the demonization of Putin has its own history. When he first appeared on the world scene as Boris Yeltsin's anointed successor, in 1999-2000, Putin was welcomed by leading representatives of the US political-media establishment. The New York Times' chief Moscow correspondent and other verifiers reported that Russia's new leader had an "emotional commitment to building a strong democracy." Two years later, President George W. Bush lauded his summit with Putin and "the beginning of a very constructive relationship." ..."
"... But the Putin-friendly narrative soon gave way to unrelenting Putin-bashing. In 2004, Times columnist Nicholas Kristof inadvertently explained why, at least partially. Kristof complained bitterly of having been "suckered by Mr. Putin. He is not a sober version of Boris Yeltsin." By 2006, a Wall Street Journal editor, expressing the establishment's revised opinion, declared it "time we start thinking of Vladimir Putin's Russia as an enemy of the United States." 10 , 11 The rest, as they say, is history. ..."
"... In America and elsewhere in the West, however, only purported "minuses" reckon in the extreme vilifying, or anti-cult, of Putin. Many are substantially uninformed, based on highly selective or unverified sources, and motivated by political grievances, including those of several Yeltsin-era oligarchs and their agents in the West. ..."
"... Putin is not the man who, after coming to power in 2000, "de-democratized" a Russian democracy established by President Boris Yeltsin in the 1990s and restored a system akin to Soviet "totalitarianism." ..."
"... Nor did Putin then make himself a tsar or Soviet-like autocrat, which means a despot with absolute power to turn his will into policy. The last Kremlin leader with that kind of power was Stalin, who died in 1953, and with him his 20-year mass terror. ..."
"... Putin is not a Kremlin leader who "reveres Stalin" and whose "Russia is a gangster shadow of Stalin's Soviet Union." 13 , 14 These assertions are so far-fetched and uninformed about Stalin's terror-ridden regime, Putin, and Russia today, they barely warrant comment. ..."
"... Nor did Putin create post-Soviet Russia's "kleptocratic economic system," with its oligarchic and other widespread corruption. This too took shape under Yeltsin during the Kremlin's shock-therapy "privatization" schemes of the 1990s, when the "swindlers and thieves" still denounced by today's opposition actually emerged. ..."
"... Which brings us to the most sinister allegation against him: Putin, trained as "a KGB thug," regularly orders the killing of inconvenient journalists and personal enemies, like a "mafia state boss." ..."
"... More recently, there is yet another allegation: Putin is a fascist and white supremacist. The accusation is made mostly, it seems, by people wishing to deflect attention from the role being played by neo-Nazis in US-backed Ukraine. ..."
"... Finally, at least for now, there is the ramifying demonization allegation that, as a foreign-policy leader, Putin has been exceedingly "aggressive" abroad and his behavior has been the sole cause of the new cold war. ..."
"... Embedded in the "aggressive Putin" axiom are two others. One is that Putin is a neo-Soviet leader who seeks to restore the Soviet Union at the expense of Russia's neighbors. He is obsessively misquoted as having said, in 2005, "The collapse of the Soviet Union was the greatest geopolitical catastrophe of the twentieth century," apparently ranking it above two World Wars. What he actually said was "a major geopolitical catastrophe of the twentieth century," as it was for most Russians. ..."
"... The other fallacious sub-axiom is that Putin has always been "anti-Western," specifically "anti-American," has "always viewed the United States" with "smoldering suspicions" -- so much that eventually he set into motion a "Plot Against America." ..."
"... Or, until he finally concluded that Russia would never be treated as an equal and that NATO had encroached too close, Putin was a full partner in the US-European clubs of major world leaders? Indeed, as late as May 2018, contrary to Russiagate allegations, he still hoped, as he had from the beginning, to rebuild Russia partly through economic partnerships with the West: "To attract capital from friendly companies and countries, we need good relations with Europe and with the whole world, including the United States." ..."
"... A few years earlier, Putin remarkably admitted that initially he had "illusions" about foreign policy, without specifying which. Perhaps he meant this, spoken at the end of 2017: "Our most serious mistake in relations with the West is that we trusted you too much. And your mistake is that you took that trust as weakness and abused it." 34 ..."
"... P. Philips ..."
"... "In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act" ..."
"... Professor Cohen is indeed a patriot of the highest order. The American and "Globalists" elites, particularly the dysfunctional United Kingdom, are engaging in a war of nerves with Russia. This war, which could turn nuclear for reasons discussed in this important book, is of no benefit to any person or nation. ..."
"... If you are a viewer of one of the legacy media outlets, be it the cable television networks (with the exception of Tucker Carlson on Fox, who has Professor Cohen as a frequent guest) or newspapers such as The New York Times, you have been exposed to falsehoods by remarkably ignorant individuals: ignorant of history, of the true nature of Russia (which defeated the Nazis in Europe at a loss of millions of lives) and, most important, of actual military experience. America is neither an invincible nor an exceptional nation. And for those familiar with the terminology of ancient history, it appears the so-called elites are suffering from hubris. ..."
Apr 01, 2019 | www.amazon.com

THE SPECTER OF AN EVIL-DOING VLADIMIR PUTIN HAS loomed over and undermined US thinking about Russia for at least a decade. Inescapably, it is therefore a theme that runs through this book. Henry Kissinger deserves credit for having warned, perhaps alone among prominent American political figures, against this badly distorted image of Russia's leader since 2000: "The demonization of Vladimir Putin is not a policy. It is an alibi for not having one." 4

But Kissinger was also wrong. Washington has made many policies strongly influenced by the demonizing of Putin -- a personal vilification far exceeding any ever applied to Soviet Russia's latter-day Communist leaders. Those policies spread from growing complaints in the early 2000s to US-Russian proxy wars in Georgia, Ukraine, Syria, and eventually even at home, in Russiagate allegations. Indeed, policy-makers adopted an earlier formulation by the late Senator John McCain as an integral part of a new and more dangerous Cold War: "Putin [is] an unreconstructed Russian imperialist and K.G.B. apparatchik.... His world is a brutish, cynical place.... We must prevent the darkness of Mr. Putin's world from befalling more of humanity."

Mainstream media outlets have played a major prosecutorial role in the demonization. Far from atypically, the Washington Post's editorial page editor wrote, "Putin likes to make the bodies bounce.... The rule-by-fear is Soviet, but this time there is no ideology -- only a noxious mixture of personal aggrandizement, xenophobia, homophobia and primitive anti-Americanism." 6 Esteemed publications and writers now routinely degrade themselves by competing to denigrate "the flabbily muscled form" of the "small gray ghoul named Vladimir Putin." 7 , 8 There are hundreds of such examples, if not more, over many years. Vilifying Russia's leader has become a canon in the orthodox US narrative of the new Cold War.

As with all institutions, the demonization of Putin has its own history. When he first appeared on the world scene as Boris Yeltsin's anointed successor, in 1999-2000, Putin was welcomed by leading representatives of the US political-media establishment. The New York Times' chief Moscow correspondent and other verifiers reported that Russia's new leader had an "emotional commitment to building a strong democracy." Two years later, President George W. Bush lauded his summit with Putin and "the beginning of a very constructive relationship."

But the Putin-friendly narrative soon gave way to unrelenting Putin-bashing. In 2004, Times columnist Nicholas Kristof inadvertently explained why, at least partially. Kristof complained bitterly of having been "suckered by Mr. Putin. He is not a sober version of Boris Yeltsin." By 2006, a Wall Street Journal editor, expressing the establishment's revised opinion, declared it "time we start thinking of Vladimir Putin's Russia as an enemy of the United States." 10 , 11 The rest, as they say, is history.

Who has Putin really been during his many years in power? We may have to leave this large, complex question to future historians, when materials for full biographical study -- memoirs, archive documents, and others -- are available. Even so, it may surprise readers to know that Russia's own historians, policy intellectuals, and journalists already argue publicly and differ considerably as to the "pluses and minuses" of Putin's leadership. (My own evaluation is somewhere in the middle.)

In America and elsewhere in the West, however, only purported "minuses" reckon in the extreme vilifying, or anti-cult, of Putin. Many are substantially uninformed, based on highly selective or unverified sources, and motivated by political grievances, including those of several Yeltsin-era oligarchs and their agents in the West.

By identifying and examining, however briefly, the primary "minuses" that underpin the demonization of Putin, we can understand at least who he is not:

  • Putin is not the man who, after coming to power in 2000, "de-democratized" a Russian democracy established by President Boris Yeltsin in the 1990s and restored a system akin to Soviet "totalitarianism." Democratization began and developed in Soviet Russia under the last Soviet leader, Mikhail Gorbachev, in the years from 1987 to 1991.

    Yeltsin repeatedly dealt that historic Russian experiment grievous, possibly fatal, blows. Among his other acts, by using tanks, in October 1993, to destroy Russia's freely elected parliament and with it the entire constitutional order that had made Yeltsin president. By waging two bloody wars against the tiny breakaway province of Chechnya. By enabling a small group of Kremlin-connected oligarchs to plunder Russia's richest assets and abet the plunging of some two-thirds of its people into poverty and misery, including the once-large and professionalized Soviet middle classes. By rigging his own reelection in 1996. And by enacting a "super-presidential" constitution, at the expense of the legislature and judiciary but to his successor's benefit. Putin may have furthered the de-democratization of the Yeltsin 1990s, but he did not initiate it.

  • Nor did Putin then make himself a tsar or Soviet-like autocrat, which means a despot with absolute power to turn his will into policy. The last Kremlin leader with that kind of power was Stalin, who died in 1953, and with him his 20-year mass terror. Due to the increasing bureaucratic routinization of the political-administrative system, each successive Soviet leader had less personal power than his predecessor. Putin may have more, but if he really is a "cold-blooded, ruthless" autocrat -- "the worst dictator on the planet" -- tens of thousands of protesters would not have repeatedly appeared in Moscow streets, sometimes officially sanctioned. Or their protests (and selective arrests) been shown on state television.

    Political scientists generally agree that Putin has been a "soft authoritarian" leader governing a system that has authoritarian and democratic components inherited from the past. They disagree as to how to specify, define, and balance these elements, but most would also generally agree with a brief Facebook post, on September 7, 2018, by the eminent diplomat-scholar Jack Matlock: "Putin ... is not the absolute dictator some have pictured him. His power seems to be based on balancing various patronage networks, some of which are still criminal. (In the 1990s, most were, and nobody was controlling them.) Therefore he cannot admit publicly that [criminal acts] happened without his approval since this would indicate that he is not completely in charge."

  • Putin is not a Kremlin leader who "reveres Stalin" and whose "Russia is a gangster shadow of Stalin's Soviet Union." 13 , 14 These assertions are so far-fetched and uninformed about Stalin's terror-ridden regime, Putin, and Russia today, they barely warrant comment. Stalin's Russia was often as close to unfreedom as imaginable. In today's Russia, apart from varying political liberties, most citizens are freer to live, study, work, write, speak, and travel than they have ever been. (When vocational demonizers like David Kramer allege an "appalling human rights situation in Putin's Russia," they should be asked: compared to when in Russian history, or elsewhere in the world today?)

    Putin clearly understands that millions of Russians have and often express pro-Stalin sentiments. Nonetheless, his role in these still-ongoing controversies over the despot's historical reputation has been, in one unprecedented way, that of an anti-Stalinist leader. Briefly illustrated: if Putin reveres the memory of Stalin, why did his personal support finally make possible two memorials (the excellent State Museum of the History of the Gulag and the highly evocative "Wall of Grief") to the tyrant's millions of victims, both in central Moscow? The latter memorial monument was first proposed by then-Kremlin leader Nikita Khrushchev, in 1961. It was not built under any of his successors -- until Putin, in 2017.

  • Nor did Putin create post-Soviet Russia's "kleptocratic economic system," with its oligarchic and other widespread corruption. This too took shape under Yeltsin during the Kremlin's shock-therapy "privatization" schemes of the 1990s, when the "swindlers and thieves" still denounced by today's opposition actually emerged.

    Putin has adopted a number of "anti-corruption" policies over the years. How successful they have been is the subject of legitimate debate. As are how much power he has had to rein in fully both Yeltsin's oligarchs and his own, and how sincere he has been. But branding Putin "a kleptocrat" 16 also lacks context and is little more than barely informed demonizing.

    A recent scholarly book finds, for example, that while they may be "corrupt," Putin "and the liberal technocratic economic team on which he relies have also skillfully managed Russia's economic fortunes." A former IMF director goes further, concluding that Putin's current economic team does not "tolerate corruption" and that "Russia now ranks 35th out of 190 in the World Bank's Doing Business ratings. It was at 124 in 2010." 18

    Viewed in human terms, when Putin came to power in 2000, some 75 percent of Russians were living in poverty. Most had lost even modest legacies of the Soviet era -- their life savings; medical and other social benefits; real wages; pensions; occupations; and, for men, life expectancy, which had fallen well below the age of 60. In only a few years, the "kleptocrat" Putin had mobilized enough wealth to undo and reverse those human catastrophes and put billions of dollars in rainy-day funds that buffered the nation in different hard times ahead. We judge this historic achievement as we might, but it is why many Russians still call Putin "Vladimir the Savior."

  • Which brings us to the most sinister allegation against him: Putin, trained as "a KGB thug," regularly orders the killing of inconvenient journalists and personal enemies, like a "mafia state boss." This should be the easiest demonizing axiom to dismiss because there is no actual evidence, or barely any logic, to support it. And yet, it is ubiquitous. Times editorial writers and columnists -- and far from them alone -- characterize Putin as a "thug" and his policies as "thuggery" so often -- sometimes doubling down on "autocratic thug" 19 -- that the practice may be specified in some internal manual. Little wonder so many politicians also routinely practice it, as did US Senator Ben Sasse: "We should tell the American people and tell the world that we know that Vladimir Putin is a thug. He's a former KGB agent who's a murderer." 20

    Leaving aside other world leaders with minor or major previous careers in intelligence services, Putin's years as a KGB intelligence officer in then-East Germany were clearly formative. Many years later, at age 67, he still spoke of them with pride. Whatever else that experience contributed, it made Putin a Europeanized Russian, a fluent German speaker, and a political leader with a remarkable, demonstrated capacity for retaining and coolly analyzing a very wide range of information. (Read or watch a few of his long interviews.) Not a bad leadership trait in very fraught times.

    Moreover, no serious biographer would treat only one period in a subject's long public career as definitive, as Putin demonizers do. Why not instead the period after he left the KGB in 1991, when he served as deputy to the mayor of St. Petersburg, then considered one of the two or three most democratic leaders in Russia? Or the years immediately following in Moscow, where he saw first-hand the full extent of Yeltsin-era corruption? Or his subsequent years, while still relatively young, as president?

    As for being a "murderer" of journalists and other "enemies," the list has grown to scores of Russians who died, at home or abroad, by foul or natural causes -- all reflexively attributed to Putin. Our hallowed tradition puts the burden of proof on the accusers. Putin's accusers have produced none, only assumptions, innuendoes, and mistranslated statements by Putin about the fate of "traitors." The two cases that firmly established this defamatory practice were those of the investigative journalist Anna Politkovskaya, who was shot to death in Moscow in 2006; and Alexander Litvinenko, a shadowy one-time KGB defector with ties to aggrieved Yeltsin-era oligarchs, who died of radiation poisoning in London, also in 2006.

    Not a shred of actual proof points to Putin in either case. The editor of Politkovskaya's paper, the devoutly independent Novaya Gazeta, still believes her assassination was ordered by Chechen officials, whose human-rights abuses she was investigating. Regarding Litvinenko, despite frenzied media claims and a kangaroo-like "hearing" suggesting that Putin was "probably" responsible, there is still no conclusive proof even as to whether Litvinenko's poisoning was intentional or accidental. The same paucity of evidence applies to many subsequent cases, notably the shooting of the opposition politician Boris Nemtsov, "in [distant] view of the Kremlin," in 2015.

    About Russian journalists, there is, however, a significant overlooked statistic. According to the American Committee to Protect Journalists, as of 2012, 77 had been murdered -- 41 during the Yeltsin years, 36 under Putin. By 2018, the total was 82 -- 41 under Yeltsin, the same under Putin. This strongly suggests that the still-partially corrupt post-Soviet economic system, not Yeltsin or Putin personally, led to the killing of so many journalists after 1991, most of them investigative reporters. The former wife of one journalist thought to have been poisoned concludes as much: "Many Western analysts place the responsibility for these crimes on Putin. But the cause is more likely the system of mutual responsibility and the culture of impunity that began to form before Putin, in the late 1990s."

  • More recently, there is yet another allegation: Putin is a fascist and white supremacist. The accusation is made mostly, it seems, by people wishing to deflect attention from the role being played by neo-Nazis in US-backed Ukraine. Putin no doubt regards it as a blood slur, and even on the surface it is, to be exceedingly charitable, entirely uninformed. How else to explain Senator Ron Wyden's solemn warnings, at a hearing on November 1, 2017, about "the current fascist leadership of Russia"? A young scholar recently dismantled a senior Yale professor's nearly inexplicable propounding of this thesis. My own approach is compatible, though different.

    Whatever Putin's failings, the fascist allegation is absurd. Nothing in his statements over nearly 20 years in power is akin to fascism, whose core belief is a cult of blood based on the asserted superiority of one ethnicity over all others. As head of a vast multi-ethnic state -- embracing scores of diverse groups with a broad range of skin colors -- such utterances or related acts by Putin would be inconceivable, if not political suicide. This is why he endlessly appeals for harmony in "our entire multi-ethnic nation" with its "multi-ethnic culture," as he did once again in his re-inauguration speech in 2018. 24

    Russia has, of course, fascist-white supremacist thinkers and activists, though many have been imprisoned. But a mass fascist movement is scarcely feasible in a country where so many millions died in the war against Nazi Germany, a war that directly affected Putin and clearly left a formative mark on him. Though he was born after the war, his mother and father barely survived near-fatal wounds and disease, his older brother died in the long German siege of Leningrad, and several of his uncles perished. Only people who never endured such an experience, or are unable to imagine it, can conjure up a fascist Putin.

    There is another, easily understood, indicative fact. Not a trace of anti-Semitism is evident in Putin. Little noted here but widely reported both in Russia and in Israel, life for Russian Jews is better under Putin than it has ever been in that country's long history.

  • Finally, at least for now, there is the ramifying demonization allegation that, as a foreign-policy leader, Putin has been exceedingly "aggressive" abroad and his behavior has been the sole cause of the new cold war. 26 At best, this is an "in-the-eye-of-the-beholder" assertion, and half-blind. At worst, it justifies what even a German foreign minister characterized as the West's "war-mongering" against Russia.

    In the three cases widely given as examples of Putin's "aggression," the evidence, long cited by myself and others, points to US-led instigations, primarily in the process of expanding the NATO military alliance since the late 1990s from Germany to Russia's borders today. The proxy US-Russian war in Georgia in 2008 was initiated by the US-backed president of that country, who had been encouraged to aspire to NATO membership. The 2014 crisis and subsequent proxy war in Ukraine resulted from the longstanding effort to bring that country, despite large regions' shared civilization with Russia, into NATO.

    And Putin's 2015 military intervention in Syria was done on a valid premise: either it would be Syrian President Bashar al-Assad in Damascus or the terrorist Islamic State -- and on President Barack Obama's refusal to join Russia in an anti-ISIS alliance. As a result of this history, Putin is often seen in Russia as a belatedly reactive leader abroad, as a not sufficiently "aggressive" one.

Embedded in the "aggressive Putin" axiom are two others. One is that Putin is a neo-Soviet leader who seeks to restore the Soviet Union at the expense of Russia's neighbors. He is obsessively misquoted as having said, in 2005, "The collapse of the Soviet Union was the greatest geopolitical catastrophe of the twentieth century," apparently ranking it above two World Wars. What he actually said was "a major geopolitical catastrophe of the twentieth century," as it was for most Russians.

Though often critical of the Soviet system and its two formative leaders, Lenin and Stalin, Putin, like most of his generation, naturally remains in part a Soviet person. But what he said in 2010 reflects his real perspective and that of very many other Russians: "Anyone who does not regret the break-up of the Soviet Union has no heart. Anyone who wants its rebirth in its previous form has no head." 28 , 29

The other fallacious sub-axiom is that Putin has always been "anti-Western," specifically "anti-American," has "always viewed the United States" with "smoldering suspicions" -- so much that eventually he set into motion a "Plot Against America." 30 , 31 A simple reading of his years in power tells us otherwise. A Westernized Russian, Putin came to the presidency in 2000 in the still prevailing tradition of Gorbachev and Yeltsin -- in hope of a "strategic friendship and partnership" with the United States.

How else to explain Putin's abundant assistance to US forces fighting in Afghanistan after 9/11 and continued facilitation of supplying American and NATO troops there? Or his backing of harsh sanctions against Iran's nuclear ambitions and refusal to sell Tehran a highly effective air-defense system? Or the information his intelligence services shared with Washington that, if heeded, could have prevented the Boston Marathon bombing in April 2013?

Or, until he finally concluded that Russia would never be treated as an equal and that NATO had encroached too close, Putin was a full partner in the US-European clubs of major world leaders? Indeed, as late as May 2018, contrary to Russiagate allegations, he still hoped, as he had from the beginning, to rebuild Russia partly through economic partnerships with the West: "To attract capital from friendly companies and countries, we need good relations with Europe and with the whole world, including the United States."

Given all that has happened during the past nearly two decades -- particularly what Putin and other Russian leaders perceive to have happened -- it would be remarkable if his views of the West, especially America, had not changed. As he remarked in 2018, "We all change." 33

A few years earlier, Putin remarkably admitted that initially he had "illusions" about foreign policy, without specifying which. Perhaps he meant this, spoken at the end of 2017: "Our most serious mistake in relations with the West is that we trusted you too much. And your mistake is that you took that trust as weakness and abused it." 34


P. Philips , December 6, 2018

"In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act"

"In a Time of Universal Deceit -- Telling the Truth Is a Revolutionary Act" is a well known quotation (but probably not of George Orwell). And in telling the truth about Russia and that the current "war of nerves" is not in the interests of either the American People or national security, Professor Cohen in this book has in fact done a revolutionary act.

Like a denizen of Plato's cave, or a captive in the film The Matrix, most people have no idea what the truth is. And the questions raised by Professor Cohen are a great service in the cause of the truth. As Professor Cohen writes in his introduction, "To His Readers":

"My scholarly work -- my biography of Nikolai Bukharin and essays collected in Rethinking the Soviet Experience and Soviet Fates and Lost Alternatives, for example -- has always been controversial because it has been what scholars term "revisionist" -- reconsiderations, based on new research and perspectives, of prevailing interpretations of Soviet and post-Soviet Russian history. But the "controversy" surrounding me since 2014, mostly in reaction to the contents of this book, has been different -- inspired by usually vacuous, defamatory assaults on me as "Putin's No. 1 American Apologist," "Best Friend," and the like. I never respond specifically to these slurs because they offer no truly substantive criticism of my arguments, only ad hominem attacks. Instead, I argue, as readers will see in the first section, that I am a patriot of American national security, that the orthodox policies my assailants promote are gravely endangering our security, and that therefore we -- I and others they assail -- are patriotic heretics. Here too readers can judge."

Cohen, Stephen F.. War with Russia (Kindle Locations 131-139). Hot Books. Kindle Edition.

Professor Cohen is indeed a patriot of the highest order. The American and "Globalists" elites, particularly the dysfunctional United Kingdom, are engaging in a war of nerves with Russia. This war, which could turn nuclear for reasons discussed in this important book, is of no benefit to any person or nation.

Indeed, with the hysteria on "climate change" isn't it odd that other than Professor Cohen's voice, there are no prominent figures warning of the devastation that nuclear war would bring?

If you are a viewer of one of the legacy media outlets, be it the cable television networks (with the exception of Tucker Carlson on Fox, who has Professor Cohen as a frequent guest) or newspapers such as The New York Times, you have been exposed to falsehoods by remarkably ignorant individuals: ignorant of history, of the true nature of Russia (which defeated the Nazis in Europe at a loss of millions of lives) and, most important, of actual military experience. America is neither an invincible nor an exceptional nation. And for those familiar with the terminology of ancient history, it appears the so-called elites are suffering from hubris.

I cannot recommend Professor Cohen's work with sufficient superlatives; his arguments are erudite, clearly stated, supported by the facts and ultimately irrefutable. If enough people find Professor Cohen's work and raise their voices to their oblivious politicians and profiteers from war to stop further confrontation between Russia and America, then this book has served a noble purpose.

If nothing else, educate yourself by reading this work to discover what the *truth* is. And the truth is something sacred.

America and the world owe Professor Cohen a great debt. "Blessed are the peace makers..."

[Mar 31, 2019] George Nader (an adviser to the crown prince of Abu Dhabi): Nobody would even waste a cup of coffee on him if it wasn't for who he was married to

Notable quotes:
"... She suggests, "Kushner was increasingly caught up in his own mythology. He was the president's son-in-law, so he apparently thought he was untouchable." (Pg. 114) She notes, "allowing Kushner to work in the administration broke with historical precedent, overruling a string of Justice Department memos that concluded it was illegal for presidents to appoint relatives as White House staff." (Pg. 119) ..."
"... She observes, "Those first few days were chaotic for almost everyone in the new administration. A frantic Reince Priebus would quickly discover that it was impossible to impose any kind of order in this White House, in large part because Trump didn't like order. What Trump liked was having people fight in front of him and then he'd make a decision, just like he'd made snap decisions when his children presented licensing deals for the Trump Organization. This kind of dysfunction enabled a 'floater' like Kushner, whose job was undefined, to weigh in on any topic in front of Trump and have far more influence than he would have had in a top-down hierarchy." (Pg. 125) ..."
Mar 31, 2019 | www.amazon.com

Steven H Propp TOP 50 REVIEWER 5.0 out of 5 stars March 27, 2019

AN INFORMATIVE BOOK ABOUT THE PRESIDENT'S DAUGHTER AND SON-IN-LAW

Author Vicky Ward wrote in the Prologue to this 2019 book, "Donald Trump was celebrating being sworn in as president... And the whole world knew that his daughter and son-in-law were his most trusted advisers, ambassadors, and coconspirators. They were an attractive couple---extremely wealthy and, now, extraordinarily powerful. Ivanka looked like Cinderella... Ivanka and her husband swept onto the stage, deftly deflecting attention from Donald Trump's clumsy moves, as she had done so often over the past twenty years. The crowd roared in approval... They were now America's prince and princess."

She notes, "Jared Kushner learned about the company [his father's] he would later run. Jared was the firm's most sheltered trainee. On his summer vacations, he'd go to work at Kushner Companies construction sites, maybe painting a few walls, more often sitting and listening to music... No one dared tell him this probably would not give him a deep understanding of the construction process. But Charlie [Jared's father] doggedly groomed his eldest son for greatness, seeing himself as a Jewish version of Joseph Kennedy..." (Pg. 17-18)

She states, "Ivanka had to fight for her father's attention and her ultimate role as the chief heir in his real estate empire... When Donald Trump divorced her mother, Ivana... she would go out of her way to see more of her father, not less... she'd call him during the day and, to her delight, he'd always take her call. (Trump's relationship with the two sons he had with Ivana, Don Jr. and Eric, was not nearly so close for years.) 'She was always Daddy's little girl,' said a family friend." (Pg. 32-33) She adds, "As Ivanka matured, physically and emotionally, her father talked openly about how impressed he was with her appearance---a habit he has maintained to this day." (Pg. 35)

She recounts, "at a networking lunch thrown by a diamond heir... Jared was introduced to Ivanka... Jared and Ivanka quickly became an intriguing gossip column item. They seemed perfectly matched... But after a year of dating, they split, in part because Jared's parents were dismayed at the idea of their son marrying outside the faith... Soon after, Ivanka agreed to convert to Judaism... Trump was said to be discombobulated by the enormity of what his daughter had done. Trump, a Presbyterian, who strikes no one as particularly religious, was baffled by his daughter's conversion... 'Why should my daughter convert to marry anyone?'" (Pg. 51-53)

She observes, "Ivanka Trump was critical in promoting her husband as the smoother, softer counterpart to his father's volatility... they could both work a room, ask after people's children, talk without notes, occasionally fake a sense of humor... And unlike her husband, she seemed to have a ready command of figures and a detailed, working knowledge of all the properties she was involved in... Ivanka seemed to control the marital relationship, but she also played the part of devoted, traditional Orthodox wife." (Pg. 70-71)

Of 2016, she states, "No one thought Kushner or Ivanka believed in Trump's populist platform. 'The two of them see this as a networking opportunity,' said a close associate. Because Kushner and Ivanka only fully immersed themselves in Trump's campaign once he became the presumptive Republican nominee... they had to push to assert themselves with the campaign staff... Kushner quickly got control of the campaign's budget, but he did not have as much authority as he would have liked." (Pg. 74-75) She adds, "Ivanka appeared thrilled by her husband's rising prominence in her father's campaign. It was a huge change from the days when Trump had made belittling jokes about him. If Don Jr. and Eric were irked by the new favorite in Trump's court, they did not show it publicly." (Pg. 85)

She points out, "Trump tweeted an image [Hillary with a backdrop of money and a Star of David] widely viewed as anti-Semitic... an 'Observer' writer criticized Kushner in his own newspaper for standing 'silent and smiling in the background' while Trump made 'repeated accidental winks' to white supremacists... Kushner wrote a response [that] insisted that Trump was neither anti-Semitic nor a racist... Not all of Kushner's relatives appreciated his efforts to cover Trump's pandering to white supremacists." (Pg. 86-87) Later, she adds, "U.S.-Israel relations was the one political issue anyone in the campaign ever saw Kushner get worked up about." (Pg. 96)

On election night, "Kushner was shocked that Trump never mentioned him in his speech and would later tell people he felt slighted. He was going to find a way to get Trump to notice him more. Ivanka would help him... the couple would become known as a single, powerful entity: 'Javanka.'" (Pg. 101) She suggests, "Kushner was increasingly caught up in his own mythology. He was the president's son-in-law, so he apparently thought he was untouchable." (Pg. 114) She notes, "allowing Kushner to work in the administration broke with historical precedent, overruling a string of Justice Department memos that concluded it was illegal for presidents to appoint relatives as White House staff." (Pg. 119)

She observes, "Those first few days were chaotic for almost everyone in the new administration. A frantic Reince Priebus would quickly discover that it was impossible to impose any kind of order in this White House, in large part because Trump didn't like order. What Trump liked was having people fight in front of him and then he'd make a decision, just like he'd made snap decisions when his children presented licensing deals for the Trump Organization. This kind of dysfunction enabled a 'floater' like Kushner, whose job was undefined, to weigh in on any topic in front of Trump and have far more influence than he would have had in a top-down hierarchy." (Pg. 125)

She recounts, "Another epic [Steve] Bannon/Ivanka fight came when Bannon was in the Oval Office dining room while Trump was watching TV and eating his lunch... Ivanka marched in, claiming Bannon had leaked H.R. McMaster's war plan... [Bannon said] 'No, that was leaked by McMaster...' Trump [told her], 'Hey, baby, I think Steve's right on this one...' Bannon thought he would be fired on the spot. But he'd learned something important: much as Trump loved his daughter and hated saying no to her, he was not always controlled by her." (Pg. 138-139)

She notes, "[Ivanka] also found a way to be near Trump when he received phone calls from foreign dignitaries -- while she still owned her business. While Ivanka's behavior was irritating, Kushner was playing a game on a whole different level: he was playing for serious money at the time of the Qatari blockade... Kushner's family had been courting the Qataris for financial help and had been turned down. When that story broke, the blockade and the Trump administration's response to it suddenly all made sense." (Pg. 156)

Arguing that "Kushner was behind the decision to fire [FBI Director James] Comey" (Pg. 163-164), she writes, "Quickly, Trump realized he'd made an error, and blamed Kushner. It seemed clear to Trump's advisers, and not for the first time, that he wished Kushner were not in the White House. He said to Kushner in front of senior staff, 'Just go back to New York, man...'" (Pg. 167) She adds, "[Ivanka's] reluctance to speak frankly to her father was the antithesis of the story she had been pushing in the media... Ivanka had told Gayle King, 'Where I disagree with my father, he knows it. And I express myself with total candor.'" (Pg. 170)

She states, "at the Group of 20 summit in Germany she briefly took her father's seat when he had to step out... The gesture seemed to send the message that the U.S. government was now run on nepotism." (Pg. 182)

E-mails from George Nader [an adviser to Sheikh Mohammed bin Zayed Al Nahyan, the crown prince of Abu Dhabi] "made it clear that Kushner's friends in the Gulf mocked him behind his back... Nader wrote, 'Nobody would even waste a cup of coffee on him if it wasn't for who he was married to.'" (Pg. 206)

She points out, "since October 2017, hundreds of children had been taken from their parents while attempting to cross the U.S.-Mexico border and detained separately... news shows everywhere showed heartbreaking images of young children being detained. The next month, Ivanka posted on Instagram a photograph of herself holding her youngest child in his pajamas. Not for the first time, her tone-deaf social media post was slammed as evidence of how isolated she was in her elitist, insulated wealthy world... On June 20, Trump signed an executive order that apparently ended the border separations. Minutes later, Ivanka finally spoke publicly on the issue... Her tactic here was to tell the public you care about an issue; watch silently while your father does the exact opposite; and when he moves a little, take all the credit." (Pg. 225)

She asserts, "Kushner's friendship with a Saudi crown prince was now under widespread scrutiny [because] Rather than expressing moral outrage over the cold-blooded murder of an innocent man [Saudi journalist Jamal Khashoggi], Kushner did what he always does in a crisis: he went quiet." (Pg. 232)

She concludes, "Ivanka Trump has made no secret of the fact that she wants to be the most powerful woman in the world. Her father's reign in Washington, D.C., is, she believes, the beginning of a great American dynasty... Ivanka has been carefully positioning herself as [Trump's] political heir..." (Pg. 236)

While not as "scandalous" as the book's subtitle might suggest, this is a very interesting book that will be of great interest to those wanting information about these crucial members of the Trump family and presidency.

[Mar 28, 2019] Was MAGA a con job?

Notable quotes:
"... Until the Crash of the Great Recession, after which we entered a "Punitive" stage, blaming "Those Others" for buying into faulty housing deals, for wanting a safety net of health care insurance, for resurgent terrorism beyond our borders, and, as the article above indicates, for having an equal citizen's voice in the electoral process. ..."
"... What needs to be restored is the purpose that "the economy works for the PEOPLE of the nation", not the other way around, as we've witnessed for the last four decades. ..."
Feb 26, 2019 | www.amazon.com

Kindle Customer, December 8, 2018

5.0 out of 5 stars How and Why the MAGA-myth Consumed Itself

Just finished reading this excellent book on how corporatist NeoLiberalism and the Xristianists merged their ideologies to form the Conservative Coalition in the 1970s, and to then hijack the RepubliCAN party of Abe, Teddy, Ike (and Poppy Bush).

The author describes three phases of the RepugliCONs' zero-sum game:

The "Combative" stage of Reagan sought to restore "family values" (aka patriarchal hierarchy) to the moral depravity of Sixties youth and the uppity claims to equal rights by blacks and feminists.

In the "Normative" stage of Gingrich and W Bush, the NeoConservatives claimed victory over Godless Communism and the NeoLibs took credit for an expanding economy (due mostly to technology, not to Fed policy). They were happy to say "Aren't you happy now?" with sole ownership of the Free World and its markets, yet ignored various Black Swan events and global trends they actually had no control over.

Until the Crash of the Great Recession, after which we entered a "Punitive" stage, blaming "Those Others" for buying into faulty housing deals, for wanting a safety net of health care insurance, for resurgent terrorism beyond our borders, and, as the article above indicates, for having an equal citizen's voice in the electoral process.

What was unexpected was that the libertarian mutiny by the TeaParty would become so nasty and vicious, leading the Pirate Trump to scavenge what little was left of American Democracy for his own treasure.

What needs to be restored is the purpose that "the economy works for the PEOPLE of the nation", not the other way around, as we've witnessed for the last four decades.

[Feb 17, 2019] Death of the Public University Uncertain Futures for Higher Education in the Knowledge Economy (Higher Education

Notable quotes:
"... Administration bloat and academic decline is another prominent feature of the neoliberal university. University presidents now view themselves as CEO and want similar salaries. ..."
Feb 17, 2019 | www.amazon.com

Customer Review

skeptic 5.0 out of 5 stars February 11, 2019 Format: Kindle Edition

An eye-opening book, very important for any student or educator

This book is a collection of more than a dozen essays by various authors, but even the Introduction (Privatizing the Public University: Key Trends, Countertrends, and Alternatives) is worth the price of the book.

Trends in the neo-liberalization of university education are not new. But recently they have taken a more dangerous turn. And they are not easy to decipher, despite the fact that they greatly affect the life of every student and educator. In this sense this is truly an eye-opening book.

In Europe, higher education was previously accessible for free or almost free, but for talented students only. Admission criteria were strict and enforced via written and oral entrance exams on key subjects. Now the trend is to view the university as a business that gets customers, charges them exorbitant fees, and at the end hands those customers a diploma for their money, like hamburgers at McDonald's. Whether those degrees are worth the money charged, or suitable for the particular student (many are "fake" degrees with little or no chance of leading to employment), is not the university's business. On the contrary, marketing is used to attract as many students as possible, and many of those students now remain in debt for a large part of their adult life.

In other words, the neoliberalization of the university in the USA has created a new, now dominant trend -- the conversion of universities into for-profit diploma mills, which are essentially a new type of rent-seeking (and even attract speculative financial capital and open scamsters, as was the case with "Trump University"). Even old universities with more than a century of history increasingly resemble diploma mills.

This assault on academic freedom by neoliberalism justifies itself by calling for "transparency" and "accountability" to the taxpayer and the public. But it operates using an utter perversion of those terms: in the neoliberal context, they mean "total surveillance" and "rampant rent-seeking."

Neoliberalism has converted education from a public good to a personal investment in the future, a future conceived in terms of earning capacity. Since this is about your future earning potential, it is only logical that for a chance to increase it you need to take out a loan.

Significantly, in the same period, per capita spending on prisons increased by 126 percent (Newfield 2008: 266). Between the 1970s and 1990s there was a 400 percent increase in tuition, room, and board charges at U.S. universities, and tuition costs have grown at about ten times the rate of family income (ibid.). What these instances highlight is not just the state's retreat from direct funding of higher education but also a calculated initiative to enable private companies to capture and profit from tax-funded student loans.

The other tendency is also alarming. Funds are now allocated to those institutions that perform best in what has become a fetishistic quest for ever-higher ratings. That creates a 'rankings arms race.' It has little or nothing to do with the quality of teaching in a particular university. On the contrary, curriculums are "streamlined," and "ideologically charged courses" such as neoclassical economics are now required for graduation even in STEM specialties.

In the neoliberal university, professors are now under the iron heel of management, and various metrics have been invented to measure the "quality of teaching." Most of them are perverted, or can be perverted, because when a measurement becomes a target, teachers start to focus their resources and activities primarily on what 'counts' rather than on their wider competencies, professional ethics, and societal goals (see Kohn and Shore, this volume).

Administration bloat and academic decline are other prominent features of the neoliberal university. University presidents now view themselves as CEOs and want similar salaries. The same is true for the growing staff of university administrators. The recruitment of administrators has far outpaced the growth in the number of faculty -- or even students. Meanwhile, universities claim to be struggling with budget crises that force them to reduce permanent academic posts, and they widely use underpaid and overworked adjunct staff -- the 'precariat,' paid just a couple of thousand dollars per course and often existing on the edge of poverty, or in real poverty.

Money is now the key objective, and the mission has changed from a cultural one to a "for profit" business, including vast expenditures on advancing the prestige and competitiveness of the university as an end in itself. The ability to get grants is now an important criterion for getting tenure.

[Jan 14, 2019] Spygate: The Attempted Sabotage of Donald J. Trump

Notable quotes:
"... Elections are just for show like many trials in the old USSR. The in power Party is the power NOT the individual voting citizens. In the end this book is about exposing the pernicious activities of those who would place themselves above the voting citizens of America. ..."
Jan 14, 2019 | www.amazon.com

Johnny G 5.0 out of 5 stars The Complex Made Easy! October 9, 2018 Format: Hardcover Verified Purchase

Regardless of your politics, this is a must-read book. The authors do a wonderful job of peeling back the layered onion that is being referred to as "Spygate." The book reads like an imaginative spy thriller, except it is as real as a fist in the stomach or the death of your best friend. In this case it is our Constitution that is victimized by individuals entrusted with "protecting and defending it from all enemies DOMESTIC and foreign."

This is in many ways a sad tale of ambition, weak men, political operatives, and hubris-ridden bureaucrats. The end result, IF this type of activity is not punished and roundly condemned by ALL Americans, could be a descent into a Solzhenitsyn GULAG type of Deep State government run by unaccountable political appointees and bureaucrats.

Elections are just for show, like many trials in the old USSR. The Party in power is the power, NOT the individual voting citizens. In the end this book is about exposing the pernicious activities of those who would place themselves above the voting citizens of America. ALL Americans should be aware of those forces, seen and unseen, that seek to injure our Constitutional Republic. This book is footnoted extensively, lest anyone believe it is a polemic political offering.

JAK 5.0 out of 5 stars The truth hurts and that's the truth October 11, 2018 Format: Hardcover Verified Purchase

This book has content that you will not see or find anywhere else. While the topic itself is covered elsewhere in large mainstream media outlets, the truth of what is actually happening is rarely ever exposed.

If there were a six-star rating or anything higher, this book would receive it, because the truth is all that matters.

This book is put together with so many news stories from far-left outlets (CNN, Bloomberg, DLSTE, Yahoo, etc.) supporting the facts of what happened that it's impossible to say "oh well, that just didn't happen" -- it was reported by the left itself, and when you put all of the pieces of the puzzle together it is painfully obvious what happened...

If these people involved don't go to jail, the death of our Republic has already happened.

[Dec 12, 2018] The Neoliberal Agenda and the Student Debt Crisis in U.S. Higher Education (Routledge Studies in Education)

Notable quotes:
"... Neoliberalism's presence in higher education is making matters worse for students and the student debt crisis, not better. ..."
"... Canaan and Shumar (2008) focus their attention on resisting, transforming, and dismantling the neoliberal paradigm in higher education. They ask: how can market-based reform serve as the solution to the problem that neoliberal practices and policies have engineered? ..."
"... What got us to where we are (escalating tuition costs, declining state monies, and increasing neoliberal influence in higher education) cannot get us out of the $1.4 trillion problem. And yet this metaphor may, in fact, be more apropos than most of us on the right, left, or center are as yet seeing because we mistakenly assume the market we have is the only or best one possible. ..."
"... We only have to realize that the emperor has no clothes and reveal this reality. ..."
"... Indeed, the approach our money-dependent and money-driven legislators and policymakers have employed has been neoliberal in form and function, and it will continue to be so unless we help them to see the light or get out of the way. This book focuses on the $1.4+ trillion student debt crisis in the United States. It doesn't share hard and fast solutions per se. ..."
"... In 2011-2012, 50% of bachelor's degree recipients from for-profit institutions borrowed more than $40,000 and about 28% of associate degree recipients from for-profit institutions borrowed more than $30,000 (College Board, 2015a). ..."
Dec 12, 2018 | www.amazon.com

Despite the fact that neoliberalism brings poor economic growth, inadequate availability of jobs and career opportunities, and the concentration of economic and social rewards in the hands of a privileged upper class, resistance to it, especially at universities, remains weak to non-existent.

The first sign of high levels of dissatisfaction with neoliberalism was the election of Trump (who, of course, betrayed all his election promises, much like Obama before him). As a result, the legitimation of neoliberalism based on references to the efficient and effective functioning of the market (ideological legitimation) is exhausted, while wealth redistribution practices (material legitimation) are not practiced and are, in fact, considered unacceptable.

Despite these problems, resistance to neoliberalism remains weak. Strategies and actions of opposition have shifted from the sphere of labor to that of the market, creating a situation in which the idea of the superiority and desirability of the market is shared by dominant and oppositional groups alike. Even emancipatory movements around women, race, ethnicity, and sexual orientation have espoused the individualistic, competition-centered, and meritocratic views typical of neoliberal discourses. Moreover, corporate forces have colonized spaces and discourses that have traditionally been employed by oppositional groups and movements. However, as systemic instability continues and capital accumulation needs to be achieved, change is necessary. Given the weakness of opposition, this change is led by corporate forces that will continue to further their interests but will also attempt to mitigate socio-economic contradictions. The unavailability of ideological mechanisms to legitimize neoliberal arrangements will motivate dominant social actors to make marginal concessions (material legitimation) to subordinate groups. These changes, however, will not alter the corporate co-optation and distortion of discourses that historically defined left-leaning opposition. As contradictions continue, however, their unsustainability will represent a real, albeit difficult, possibility for anti-neoliberal aggregation and substantive change.

Connolly (2016) reported that a poll shows some student loan borrowers who have graduated would willingly go to extremes to pay off outstanding student debt. Those extremes include enduring physical pain and suffering and even a reduced lifespan. For instance, 35% of those polled would give up one year of life expectancy, and 6.5% would willingly cut off their pinky finger, if it meant ridding themselves of the student loan debt they currently held.

Neoliberalism's presence in higher education is making matters worse for students and the student debt crisis, not better. In their book Structure and Agency in the Neoliberal University, Canaan and Shumar (2008) focus their attention on resisting, transforming, and dismantling the neoliberal paradigm in higher education. They ask: how can market-based reform serve as the solution to the problem that neoliberal practices and policies have engineered?

It is like an individual who loses his keys at night and decides to look only beneath the street light. This may be convenient because there is light, but it might not be where the keys are located. This metaphorical example relates to the student debt crisis: what got us to where we are (escalating tuition costs, declining state monies, and increasing neoliberal influence in higher education) cannot get us out of the $1.4 trillion problem. And yet this metaphor may, in fact, be more apropos than most of us on the right, left, or center are as yet seeing, because we mistakenly assume the market we have is the only or best one possible.

As Lucille (this volume) strives to expose, the systemic cause of our problem is "hidden in plain sight," right there in the street light for all who look carefully enough to see. We only have to realize that the emperor has no clothes and reveal this reality. If and when a critical mass of us do, systemic change in our monetary exchange relations can and, we hope, will become our funnel toward a sustainable and socially, economically, and ecologically just future where public education and democracy can finally become realities rather than merely ideals.

Indeed, the approach our money-dependent and money-driven legislators and policymakers have employed has been neoliberal in form and function, and it will continue to be so unless we help them to see the light or get out of the way. This book focuses on the $1.4+ trillion student debt crisis in the United States. It doesn't share hard and fast solutions per se. Rather, it addresses real questions (and their real consequences). Are collegians overestimating the economic value of going to college?

What are we, they, and our so-called elected leaders failing or refusing to see, and why? This critically minded, soul-searching volume shares territory with, yet pushes beyond, that of Akers and Chingos (2016), Baum (2016), Goldrick-Rab (2016), Graeber (2011), and Johannsen (2016) in ways that we trust those critically minded authors -- and others concerned with our mess of debts, public and private, and unfulfilled human potential -- will find enlightening and even ground-breaking.

... ... ...

In the meantime, college costs have significantly increased over the past fifty years. The average cost of tuition and fees (excluding room and board) for public four-year institutions for a full year increased from $2,387 (in 2015 dollars) for the 1975-1976 academic year to $9,410 for 2015-2016. Tuition for public two-year colleges averaged $1,079 in 1975-1976 (in 2015 dollars) and increased to $3,435 for 2015-2016. At private non-profit four-year institutions, the average 1975-1976 cost of tuition and fees (excluding room and board) was $10,088 (in 2015 dollars), which increased to $32,405 for 2015-2016 (College Board, 2015b).

The purchasing power of Pell Grants has decreased. In fact, the maximum Pell Grants coverage of public four-year tuition and fees decreased from 83% in 1995-1996 to 61% in 2015-2016. The maximum Pell Grants coverage of private non-profit four-year tuition and fees decreased from 19% in 1995-1996 to 18% in 2015-2016 (College Board, 2015a).

... ... ....

... In 2013-2014, 61% of bachelor's degree recipients from public and private non-profit four-year institutions graduated with an average debt of $16,300 per graduate. In 2011-2012, 50% of bachelor's degree recipients from for-profit institutions borrowed more than $40,000 and about 28% of associate degree recipients from for-profit institutions borrowed more than $30,000 (College Board, 2015a).

Rising student debt has become a key issue of higher education finance among many policymakers and researchers. Recently, the government has implemented a series of measures to address student debt. In 2005, the Bankruptcy Abuse Prevention and Consumer Protection Act was passed, which barred the discharge of all student loans through bankruptcy for most borrowers (Collinge, 2009). This was the final nail in the bankruptcy coffin, which had begun in 1976 with a five-year ban on student loan debt (SLD) bankruptcy that was extended to seven years in 1990. Then in 1998, it became a permanent ban for all who could not clear the relatively high bar of undue hardship (Best & Best, 2014).

By 2006, Sallie Mae had become the nation's largest private student loan lender, reporting loan holdings of $123 billion. Its fee income collected from defaulted loans grew from $280 million in 2000 to $920 million in 2005 (Collinge, 2009). In 2007, in response to growing student default rates, the College Cost Reduction Act was passed to provide loan forgiveness for student loan borrowers who work full-time in a public service job. The Federal Direct Loan will be forgiven after 120 payments were made. This Act also provided other benefits for students to pay for their postsecondary education, such as lowering interest rates of GSL, increasing the maximum amount of Pell Grant (though, as noted above, not sufficiently to meet rising tuition rates), as well as reducing guarantor collection fees (Collinge, 2009).

In 2008, the Higher Education Opportunity Act was passed to increase transparency and accountability. This Act required institutions participating in federal financial aid programs to post a college price calculator on their websites in order to provide better college cost information for students and families (U.S. Department of Education [U.S. DoE], 2015a). Due to the recession of 2008, the American Opportunity Tax Credit of 2009 (AOTC) was passed to expand the Hope Tax Credit program: the amount of tax credit increased to 100% of the first $2,000 of qualified educational expenses and 25% of the second $2,000 in college expenses. The total credit cap increased from $1,500 to $2,500 per student. As a result, federal spending on education tax benefits has increased substantially since then (Crandall-Hollick, 2014) -- benefits that, again, are reaped only by those who file income taxes.

[Nov 05, 2018] How neoliberals destroyed University education and then a large part of the US middle class and the US postwar social order by Edward Qualtrough

Notable quotes:
"... Every academic critique of neoliberalism is an unacknowledged memoir. We academics occupy a crucial node in the neoliberal system. Our institutions are foundational to neoliberalism's claim to be a meritocracy, insofar as we are tasked with discerning and certifying the merit that leads to the most powerful and desirable jobs. Yet at the same time, colleges and universities have suffered the fate of all public goods under the neoliberal order. We must therefore "do more with less," cutting costs while meeting ever-greater demands. The academic workforce faces increasing precarity and shrinking wages even as it is called on to teach and assess more students than ever before in human history -- and to demonstrate that we are doing so better than ever, via newly devised regimes of outcome-based assessment. In short, we academics live out the contradictions of neoliberalism every day. ..."
"... Whereas classical liberalism insisted that capitalism had to be allowed free rein within its sphere, under neoliberalism capitalism no longer has a set sphere. We are always "on the clock," always accruing (or squandering) various forms of financial and social capital. ..."
Aug 24, 2016 | www.amazon.com

From: Amazon.com: Neoliberalism's Demons: On the Political Theology of Late Capital (ISBN 9781503607125), by Adam Kotsko

Every academic critique of neoliberalism is an unacknowledged memoir. We academics occupy a crucial node in the neoliberal system. Our institutions are foundational to neoliberalism's claim to be a meritocracy, insofar as we are tasked with discerning and certifying the merit that leads to the most powerful and desirable jobs. Yet at the same time, colleges and universities have suffered the fate of all public goods under the neoliberal order. We must therefore "do more with less," cutting costs while meeting ever-greater demands. The academic workforce faces increasing precarity and shrinking wages even as it is called on to teach and assess more students than ever before in human history -- and to demonstrate that we are doing so better than ever, via newly devised regimes of outcome-based assessment. In short, we academics live out the contradictions of neoliberalism every day.

... ... ...

On a more personal level it reflects my upbringing in the suburbs of Flint, Michigan, a city that has been utterly devastated by the transition to neoliberalism. As I lived through the slow-motion disaster of the gradual withdrawal of the auto industry, I often heard Henry Ford's dictum that a company could make more money if the workers were paid enough to be customers as well, a principle that the major US automakers were inexplicably abandoning. Hence I find it [Fordism -- NNB] to be an elegant way of capturing the postwar model's promise of creating broadly shared prosperity by retooling capitalism to produce a consumer society characterized by a growing middle class -- and of emphasizing the fact that that promise was ultimately broken.

By the mid-1970s, the postwar Fordist order had begun to break down to varying degrees in the major Western countries. While many powerful groups advocated a response to the crisis that would strengthen the welfare state, the agenda that wound up carrying the day was neoliberalism, which was most forcefully implemented in the United Kingdom by Margaret Thatcher and in the United States by Ronald Reagan. And although this transformation was begun by the conservative party, in both countries the left-of-center or (in American usage) "liberal" party wound up embracing neoliberal tenets under Tony Blair and Bill Clinton, ostensibly for the purpose of directing them toward progressive ends.

In the context of current debates within the US Democratic Party, this means that Clinton acolytes are correct to claim that "neoliberalism" just is liberalism, but only to the extent that, in the contemporary United States, the term liberalism is little more than a word for whatever the policy agenda of the Democratic Party happens to be at any given time. Though politicians of all stripes at times used libertarian rhetoric to sell their policies, the most clear-eyed advocates of neoliberalism realized that there could be no simple question of a "return" to the laissez-faire model.

Rather than simply getting the state "out of the way," they both deployed and transformed state power, including the institutions of the welfare state, to reshape society in accordance with market models. In some cases this meant creating markets where none had previously existed, as in the privatization of education and other public services. In others it took the form of a more general spread of a competitive market ethos into ever more areas of life -- so that we are encouraged to think of our reputation as a "brand," for instance, or our social contacts as fodder for "networking." Whereas classical liberalism insisted that capitalism had to be allowed free rein within its sphere, under neoliberalism capitalism no longer has a set sphere. We are always "on the clock," always accruing (or squandering) various forms of financial and social capital.

[Mar 19, 2018] PyCharm - Python IDE Full Review

An increasingly popular installation method: "snap install pycharm-community --classic".
Mar 19, 2018 | www.linuxandubuntu.com

PyCharm is a powerful Integrated Development Environment that can be used to develop Python applications, web apps, and even data analysis tools. PyCharm has everything a Python developer needs. The IDE is full of surprises and keyboard shortcuts that will leave you impressed and at the same time satisfied that your projects are completed on time. Good work from JetBrains. Couldn't have done any better.

[Dec 16, 2017] 3. Data model -- Python 3.6.4rc1 documentation

Notable quotes:
"... __slots__ ..."
"... Note that the current implementation only supports function attributes on user-defined functions. Function attributes on built-in functions may be supported in the future. ..."
"... generator function ..."
"... coroutine function ..."
"... asynchronous generator function ..."
"... operator overloading ..."
"... __init_subclass__ ..."
"... context manager ..."
"... asynchronous iterable ..."
"... asynchronous iterator ..."
"... asynchronous iterator ..."
"... asynchronous context manager ..."
"... context manager ..."
Dec 16, 2017 | docs.python.org

3. Data model

3.1. Objects, values and types

Objects are Python's abstraction for data. All data in a Python program is represented by objects or by relations between objects. (In a sense, and in conformance to Von Neumann's model of a "stored program computer," code is also represented by objects.)

Every object has an identity, a type and a value. An object's identity never changes once it has been created; you may think of it as the object's address in memory. The 'is' operator compares the identity of two objects; the id() function returns an integer representing its identity.

CPython implementation detail: For CPython, id(x) is the memory address where x is stored.
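The identity rules above can be checked directly. A minimal sketch (the variable names are illustrative):

```python
a = [1, 2, 3]
b = a          # a second name for the SAME object
c = [1, 2, 3]  # a distinct object with an equal value

print(a is b)          # True  -- same identity
print(a is c)          # False -- different objects
print(a == c)          # True  -- equal values
print(id(a) == id(b))  # True  -- id() reflects identity

# For immutable types the implementation MAY reuse objects:
x = 1
y = 1
print(x == y)  # True; `x is y` also happens to be True in CPython for
               # small ints, but that is an implementation detail
```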

An object's type determines the operations that the object supports (e.g., "does it have a length?") and also defines the possible values for objects of that type. The type() function returns an object's type (which is an object itself). Like its identity, an object's type is also unchangeable. [1]

The value of some objects can change. Objects whose value can change are said to be mutable ; objects whose value is unchangeable once they are created are called immutable . (The value of an immutable container object that contains a reference to a mutable object can change when the latter's value is changed; however the container is still considered immutable, because the collection of objects it contains cannot be changed. So, immutability is not strictly the same as having an unchangeable value, it is more subtle.) An object's mutability is determined by its type; for instance, numbers, strings and tuples are immutable, while dictionaries and lists are mutable.
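A quick illustration of the mutable/immutable distinction, using a list and a tuple:

```python
nums = [1, 2, 3]      # lists are mutable
nums.append(4)        # the value changes in place; the identity does not

point = (1, 2)        # tuples are immutable
try:
    point[0] = 9      # any in-place modification raises TypeError
except TypeError as err:
    print("immutable:", err)
```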

Objects are never explicitly destroyed; however, when they become unreachable they may be garbage-collected. An implementation is allowed to postpone garbage collection or omit it altogether -- it is a matter of implementation quality how garbage collection is implemented, as long as no objects are collected that are still reachable.

CPython implementation detail: CPython currently uses a reference-counting scheme with (optional) delayed detection of cyclically linked garbage, which collects most objects as soon as they become unreachable, but is not guaranteed to collect garbage containing circular references. See the documentation of the gc module for information on controlling the collection of cyclic garbage. Other implementations act differently and CPython may change. Do not depend on immediate finalization of objects when they become unreachable (so you should always close files explicitly).

Note that the use of the implementation's tracing or debugging facilities may keep objects alive that would normally be collectable. Also note that catching an exception with a 'try ... except' statement may keep objects alive.

Some objects contain references to "external" resources such as open files or windows. It is understood that these resources are freed when the object is garbage-collected, but since garbage collection is not guaranteed to happen, such objects also provide an explicit way to release the external resource, usually a close() method. Programs are strongly recommended to explicitly close such objects. The 'try ... finally' statement and the 'with' statement provide convenient ways to do this.

Some objects contain references to other objects; these are called containers . Examples of containers are tuples, lists and dictionaries. The references are part of a container's value. In most cases, when we talk about the value of a container, we imply the values, not the identities of the contained objects; however, when we talk about the mutability of a container, only the identities of the immediately contained objects are implied. So, if an immutable container (like a tuple) contains a reference to a mutable object, its value changes if that mutable object is changed.

Types affect almost all aspects of object behavior. Even the importance of object identity is affected in some sense: for immutable types, operations that compute new values may actually return a reference to any existing object with the same type and value, while for mutable objects this is not allowed. E.g., after a = 1; b = 1, a and b may or may not refer to the same object with the value one, depending on the implementation, but after c = []; d = [], c and d are guaranteed to refer to two different, unique, newly created empty lists. (Note that c = d = [] assigns the same object to both c and d.)

3.2. The standard type hierarchy

Below is a list of the types that are built into Python. Extension modules (written in C, Java, or other languages, depending on the implementation) can define additional types. Future versions of Python may add types to the type hierarchy (e.g., rational numbers, efficiently stored arrays of integers, etc.), although such additions will often be provided via the standard library instead.

Some of the type descriptions below contain a paragraph listing 'special attributes.' These are attributes that provide access to the implementation and are not intended for general use. Their definition may change in the future.

None

This type has a single value. There is a single object with this value. This object is accessed through the built-in name None . It is used to signify the absence of a value in many situations, e.g., it is returned from functions that don't explicitly return anything. Its truth value is false.

NotImplemented

This type has a single value. There is a single object with this value. This object is accessed through the built-in name NotImplemented . Numeric methods and rich comparison methods should return this value if they do not implement the operation for the operands provided. (The interpreter will then try the reflected operation, or some other fallback, depending on the operator.) Its truth value is true.

See Implementing the arithmetic operations for more details.

Ellipsis

This type has a single value. There is a single object with this value. This object is accessed through the literal ... or the built-in name Ellipsis . Its truth value is true.

numbers.Number

These are created by numeric literals and returned as results by arithmetic operators and arithmetic built-in functions. Numeric objects are immutable; once created their value never changes. Python numbers are of course strongly related to mathematical numbers, but subject to the limitations of numerical representation in computers.

Python distinguishes between integers, floating point numbers, and complex numbers:

numbers.Integral

These represent elements from the mathematical set of integers (positive and negative).

There are two types of integers:

Integers ( int )

These represent numbers in an unlimited range, subject to available (virtual) memory only. For the purpose of shift and mask operations, a binary representation is assumed, and negative numbers are represented in a variant of 2's complement which gives the illusion of an infinite string of sign bits extending to the left.
Booleans ( bool )

These represent the truth values False and True. The two objects representing the values False and True are the only Boolean objects. The Boolean type is a subtype of the integer type, and Boolean values behave like the values 0 and 1, respectively, in almost all contexts, the exception being that when converted to a string, the strings "False" or "True" are returned, respectively.
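The bool-as-int-subtype behavior, including the string-conversion exception, can be verified as follows:

```python
print(isinstance(True, int))  # True: bool is a subtype of int
print(True + True)            # 2 -- True behaves like 1 in arithmetic
print(str(False))             # 'False' -- the string-conversion exception
print([10, 20][True])         # 20 -- True even works as the index 1
```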

The rules for integer representation are intended to give the most meaningful interpretation of shift and mask operations involving negative integers.

numbers.Real ( float )

These represent machine-level double precision floating point numbers. You are at the mercy of the underlying machine architecture (and C or Java implementation) for the accepted range and handling of overflow. Python does not support single-precision floating point numbers; the savings in processor and memory usage that are usually the reason for using these are dwarfed by the overhead of using objects in Python, so there is no reason to complicate the language with two kinds of floating point numbers.

numbers.Complex ( complex )

These represent complex numbers as a pair of machine-level double precision floating point numbers. The same caveats apply as for floating point numbers. The real and imaginary parts of a complex number can be retrieved through the read-only attributes z.real and z.imag .

Sequences

These represent finite ordered sets indexed by non-negative numbers. The built-in function len() returns the number of items of a sequence. When the length of a sequence is n, the index set contains the numbers 0, 1, ..., n-1. Item i of sequence a is selected by a[i].

Sequences also support slicing: a[i:j] selects all items with index k such that i <= k < j . When used as an expression, a slice is a sequence of the same type. This implies that the index set is renumbered so that it starts at 0.

Some sequences also support "extended slicing" with a third "step" parameter: a[i:j:k] selects all items of a with index x where x = i + n*k, n >= 0 and i <= x < j.
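A short sketch of plain and extended slicing (the list contents are illustrative):

```python
a = ['p', 'y', 't', 'h', 'o', 'n', '!']
print(a[1:4])    # ['y', 't', 'h'] -- items with index k where 1 <= k < 4
print(a[0:7:2])  # ['p', 't', 'o', '!'] -- every second item
print(a[::-1])   # reversed copy, via a negative step
```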

Sequences are distinguished according to their mutability:

Immutable sequences

An object of an immutable sequence type cannot change once it is created. (If the object contains references to other objects, these other objects may be mutable and may be changed; however, the collection of objects directly referenced by an immutable object cannot change.)

The following types are immutable sequences:

Strings

A string is a sequence of values that represent Unicode code points. All the code points in the range U+0000 - U+10FFFF can be represented in a string. Python doesn't have a char type; instead, every code point in the string is represented as a string object with length 1. The built-in function ord() converts a code point from its string form to an integer in the range 0 - 10FFFF; chr() converts an integer in the range 0 - 10FFFF to the corresponding length 1 string object. str.encode() can be used to convert a str to bytes using the given text encoding, and bytes.decode() can be used to achieve the opposite.
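A small demonstration of these conversions (the sample string is arbitrary):

```python
print(ord('A'))            # 65
print(ord(chr(0x10FFFF)))  # 1114111 -- round trip at the top of the range
s = 'héllo'
b = s.encode('utf-8')      # str -> bytes
print(b.decode('utf-8') == s)  # True -- encode/decode round trip
print(len('x'))            # 1 -- there is no char type, only length-1 str
```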

Tuples

The items of a tuple are arbitrary Python objects. Tuples of two or more items are formed by comma-separated lists of expressions. A tuple of one item (a 'singleton') can be formed by affixing a comma to an expression (an expression by itself does not create a tuple, since parentheses must be usable for grouping of expressions). An empty tuple can be formed by an empty pair of parentheses.
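The comma rule for one-item tuples is a common stumbling block; a minimal illustration:

```python
singleton = (1,)   # the comma, not the parentheses, makes the tuple
not_a_tuple = (1)  # just the integer 1 inside grouping parentheses
empty = ()         # the empty tuple is the one exception
print(type(singleton).__name__, type(not_a_tuple).__name__, len(empty))
```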

Bytes

A bytes object is an immutable array. The items are 8-bit bytes, represented by integers in the range 0 <= x < 256. Bytes literals (like b'abc' ) and the built-in bytes() constructor can be used to create bytes objects. Also, bytes objects can be decoded to strings via the decode() method.

Mutable sequences

Mutable sequences can be changed after they are created. The subscription and slicing notations can be used as the target of assignment and del (delete) statements.

There are currently two intrinsic mutable sequence types:

Lists

The items of a list are arbitrary Python objects. Lists are formed by placing a comma-separated list of expressions in square brackets. (Note that there are no special cases needed to form lists of length 0 or 1.)

Byte Arrays

A bytearray object is a mutable array. They are created by the built-in bytearray() constructor. Aside from being mutable (and hence unhashable), byte arrays otherwise provide the same interface and functionality as immutable bytes objects.

The extension module array provides an additional example of a mutable sequence type, as does the collections module.

Set types

These represent unordered, finite sets of unique, immutable objects. As such, they cannot be indexed by any subscript. However, they can be iterated over, and the built-in function len() returns the number of items in a set. Common uses for sets are fast membership testing, removing duplicates from a sequence, and computing mathematical operations such as intersection, union, difference, and symmetric difference.

For set elements, the same immutability rules apply as for dictionary keys. Note that numeric types obey the normal rules for numeric comparison: if two numbers compare equal (e.g., 1 and 1.0), only one of them can be contained in a set.
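A sketch of numeric equality inside sets (note that True also compares equal to 1):

```python
s = {1, 1.0, True}       # all three compare equal, so only one survives
print(len(s))            # 1
print(2 in {1.0, 2.0})   # True: membership uses numeric equality, not type
```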

There are currently two intrinsic set types:

Sets

These represent a mutable set. They are created by the built-in set() constructor and can be modified afterwards by several methods, such as add() .

Frozen sets

These represent an immutable set. They are created by the built-in frozenset() constructor. As a frozenset is immutable and hashable , it can be used again as an element of another set, or as a dictionary key.

Mappings

These represent finite sets of objects indexed by arbitrary index sets. The subscript notation a[k] selects the item indexed by k from the mapping a; this can be used in expressions and as the target of assignments or del statements. The built-in function len() returns the number of items in a mapping.

There is currently a single intrinsic mapping type:

Dictionaries

These represent finite sets of objects indexed by nearly arbitrary values. The only types of values not acceptable as keys are values containing lists or dictionaries or other mutable types that are compared by value rather than by object identity, the reason being that the efficient implementation of dictionaries requires a key's hash value to remain constant. Numeric types used for keys obey the normal rules for numeric comparison: if two numbers compare equal (e.g., 1 and 1.0) then they can be used interchangeably to index the same dictionary entry.

Dictionaries are mutable; they can be created by the {...} notation (see section Dictionary displays ).
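A brief illustration of the key rules described above (the values are illustrative):

```python
d = {(1, 2): "tuple key", 1: "int key"}
print(d[1.0])           # 'int key' -- 1 and 1.0 index the same entry
try:
    d[[1, 2]] = "nope"  # lists are unhashable, so this raises TypeError
except TypeError as err:
    print("unhashable:", err)
```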

The extension modules dbm.ndbm and dbm.gnu provide additional examples of mapping types, as does the collections module.

Callable types

These are the types to which the function call operation (see section Calls ) can be applied:

User-defined functions

A user-defined function object is created by a function definition (see section Function definitions ). It should be called with an argument list containing the same number of items as the function's formal parameter list.

Special attributes:

Attribute Meaning
__doc__ The function's documentation string, or None if unavailable; not inherited by subclasses Writable
__name__ The function's name Writable
__qualname__

The function's qualified name

New in version 3.3.
Writable
__module__ The name of the module the function was defined in, or None if unavailable. Writable
__defaults__ A tuple containing default argument values for those arguments that have defaults, or None if no arguments have a default value Writable
__code__ The code object representing the compiled function body. Writable
__globals__ A reference to the dictionary that holds the function's global variables -- the global namespace of the module in which the function was defined. Read-only
__dict__ The namespace supporting arbitrary function attributes. Writable
__closure__ None or a tuple of cells that contain bindings for the function's free variables. Read-only
__annotations__ A dict containing annotations of parameters. The keys of the dict are the parameter names, and 'return' for the return annotation, if provided. Writable
__kwdefaults__ A dict containing defaults for keyword-only parameters. Writable

Most of the attributes labelled "Writable" check the type of the assigned value.

Function objects also support getting and setting arbitrary attributes, which can be used, for example, to attach metadata to functions. Regular attribute dot-notation is used to get and set such attributes. Note that the current implementation only supports function attributes on user-defined functions. Function attributes on built-in functions may be supported in the future.
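A minimal sketch of attaching metadata via function attributes (greet and author are invented names):

```python
def greet(name):
    """Return a greeting."""
    return "hello " + name

greet.author = "alice"             # arbitrary attribute, stored in __dict__
print(greet.author)                # 'alice'
print(greet.__name__)              # 'greet'
print('author' in greet.__dict__)  # True
```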

Additional information about a function's definition can be retrieved from its code object; see the description of internal types below.

Instance methods

An instance method object combines a class, a class instance and any callable object (normally a user-defined function).

Special read-only attributes: __self__ is the class instance object, __func__ is the function object; __doc__ is the method's documentation (same as __func__.__doc__ ); __name__ is the method name (same as __func__.__name__ ); __module__ is the name of the module the method was defined in, or None if unavailable.

Methods also support accessing (but not setting) the arbitrary function attributes on the underlying function object.

User-defined method objects may be created when getting an attribute of a class (perhaps via an instance of that class), if that attribute is a user-defined function object or a class method object.

When an instance method object is created by retrieving a user-defined function object from a class via one of its instances, its __self__ attribute is the instance, and the method object is said to be bound. The new method's __func__ attribute is the original function object.

When a user-defined method object is created by retrieving another method object from a class or instance, the behaviour is the same as for a function object, except that the __func__ attribute of the new instance is not the original method object but its __func__ attribute.

When an instance method object is created by retrieving a class method object from a class or instance, its __self__ attribute is the class itself, and its __func__ attribute is the function object underlying the class method.

When an instance method object is called, the underlying function ( __func__ ) is called, inserting the class instance ( __self__ ) in front of the argument list. For instance, when C is a class which contains a definition for a function f(), and x is an instance of C, calling x.f(1) is equivalent to calling C.f(x, 1).

When an instance method object is derived from a class method object, the "class instance" stored in __self__ will actually be the class itself, so that calling either x.f(1) or C.f(1) is equivalent to calling f(C, 1) where f is the underlying function.

Note that the transformation from function object to instance method object happens each time the attribute is retrieved from the instance. In some cases, a fruitful optimization is to assign the attribute to a local variable and call that local variable. Also notice that this transformation only happens for user-defined functions; other callable objects (and all non-callable objects) are retrieved without transformation. It is also important to note that user-defined functions which are attributes of a class instance are not converted to bound methods; this only happens when the function is an attribute of the class.
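The binding behavior can be observed directly; a minimal sketch with an invented class C:

```python
class C:
    def f(self, n):
        return n + 1

x = C()
print(x.f(1) == C.f(x, 1))  # True: the instance is inserted as first arg
m = x.f                     # a bound method; binding happens at lookup time
print(m.__self__ is x)      # True
print(m.__func__ is C.f)    # True: the original function object
```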

Generator functions

A function or method which uses the yield statement (see section The yield statement ) is called a generator function . Such a function, when called, always returns an iterator object which can be used to execute the body of the function: calling the iterator's iterator.__next__() method will cause the function to execute until it provides a value using the yield statement. When the function executes a return statement or falls off the end, a StopIteration exception is raised and the iterator will have reached the end of the set of values to be returned.
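A minimal generator-function sketch (countdown is an invented name):

```python
def countdown(n):
    while n > 0:
        yield n      # suspends here; resumes on the next __next__() call
        n -= 1

it = countdown(3)    # calling the function returns an iterator
print(next(it))      # 3
print(list(it))      # [2, 1] -- exhausts it; further next() raises StopIteration
```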

Coroutine functions

A function or method which is defined using async def is called a coroutine function . Such a function, when called, returns a coroutine object. It may contain await expressions, as well as async with and async for statements. See also the Coroutine Objects section.
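A minimal coroutine sketch; note that it is driven with asyncio.run(), which arrived in Python 3.7, slightly after the 3.6 documentation quoted here:

```python
import asyncio

async def fetch(value):        # a coroutine function
    await asyncio.sleep(0)     # await suspends until the awaitable completes
    return value * 2

coro = fetch(21)               # calling it returns a coroutine object;
print(type(coro).__name__)     # 'coroutine' -- nothing has run yet
print(asyncio.run(coro))       # 42 -- an event loop drives it to completion
```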

Asynchronous generator functions

A function or method which is defined using async def and which uses the yield statement is called an asynchronous generator function . Such a function, when called, returns an asynchronous iterator object which can be used in an async for statement to execute the body of the function.

Calling the asynchronous iterator's aiterator.__anext__() method will return an awaitable which when awaited will execute until it provides a value using the yield expression. When the function executes an empty return statement or falls off the end, a StopAsyncIteration exception is raised and the asynchronous iterator will have reached the end of the set of values to be yielded.
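A minimal asynchronous-generator sketch (again driven with asyncio.run(), Python 3.7+; the names are invented):

```python
import asyncio

async def ticks(n):            # async def + yield = asynchronous generator
    for i in range(n):
        yield i

async def main():
    got = []
    async for i in ticks(3):   # drives __anext__() under the hood
        got.append(i)
    return got                 # loop ends when StopAsyncIteration is raised

print(asyncio.run(main()))     # [0, 1, 2]
```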

Built-in functions

A built-in function object is a wrapper around a C function. Examples of built-in functions are len() and math.sin() ( math is a standard built-in module). The number and type of the arguments are determined by the C function. Special read-only attributes: __doc__ is the function's documentation string, or None if unavailable; __name__ is the function's name; __self__ is set to None (but see the next item); __module__ is the name of the module the function was defined in or None if unavailable.

Built-in methods

This is really a different disguise of a built-in function, this time containing an object passed to the C function as an implicit extra argument. An example of a built-in method is alist.append() , assuming alist is a list object. In this case, the special read-only attribute __self__ is set to the object denoted by alist .

Classes
Classes are callable. These objects normally act as factories for new instances of themselves, but variations are possible for class types that override __new__() . The arguments of the call are passed to __new__() and, in the typical case, to __init__() to initialize the new instance.
Class Instances
Instances of arbitrary classes can be made callable by defining a __call__() method in their class.
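A sketch of a callable instance via __call__ (Counter is an invented example):

```python
class Counter:
    def __init__(self):
        self.n = 0

    def __call__(self, step=1):  # makes every instance callable
        self.n += step
        return self.n

c = Counter()
print(c(), c(), c(10))   # 1 2 12 -- the instance is called like a function
print(callable(c))       # True
```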
Modules

Modules are a basic organizational unit of Python code, and are created by the import system as invoked either by the import statement (see import ), or by calling functions such as importlib.import_module() and built-in __import__() . A module object has a namespace implemented by a dictionary object (this is the dictionary referenced by the __globals__ attribute of functions defined in the module). Attribute references are translated to lookups in this dictionary, e.g., m.x is equivalent to m.__dict__["x"] . A module object does not contain the code object used to initialize the module (since it isn't needed once the initialization is done).

Attribute assignment updates the module's namespace dictionary, e.g., m.x is equivalent to m.__dict__["x"] .
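The attribute/namespace-dictionary equivalence is easy to observe; adding an attribute to the math module below is purely illustrative (and not something to do in real code):

```python
import math

print(math.pi == math.__dict__["pi"])  # True: attribute access is a dict lookup
math.greeting = "hi"                   # assignment updates the namespace dict
print(math.__dict__["greeting"])       # 'hi'
print(math.__name__)                   # 'math'
```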

Predefined (writable) attributes: __name__ is the module's name; __doc__ is the module's documentation string, or None if unavailable; __annotations__ (optional) is a dictionary containing variable annotations collected during module body execution; __file__ is the pathname of the file from which the module was loaded, if it was loaded from a file. The __file__ attribute may be missing for certain types of modules, such as C modules that are statically linked into the interpreter; for extension modules loaded dynamically from a shared library, it is the pathname of the shared library file.

Special read-only attribute: __dict__ is the module's namespace as a dictionary object.

CPython implementation detail: Because of the way CPython clears module dictionaries, the module dictionary will be cleared when the module falls out of scope even if the dictionary still has live references. To avoid this, copy the dictionary or keep the module around while using its dictionary directly.
Custom classes

Custom class types are typically created by class definitions (see section Class definitions ). A class has a namespace implemented by a dictionary object. Class attribute references are translated to lookups in this dictionary, e.g., C.x is translated to C.__dict__["x"] (although there are a number of hooks which allow for other means of locating attributes). When the attribute name is not found there, the attribute search continues in the base classes. This search of the base classes uses the C3 method resolution order which behaves correctly even in the presence of 'diamond' inheritance structures where there are multiple inheritance paths leading back to a common ancestor. Additional details on the C3 MRO used by Python can be found in the documentation accompanying the 2.3 release at https://www.python.org/download/releases/2.3/mro/ .

When a class attribute reference (for class C, say) would yield a class method object, it is transformed into an instance method object whose __self__ attribute is C. When it would yield a static method object, it is transformed into the object wrapped by the static method object. See section Implementing Descriptors for another way in which attributes retrieved from a class may differ from those actually contained in its __dict__ .

Class attribute assignments update the class's dictionary, never the dictionary of a base class.

A class object can be called (see above) to yield a class instance (see below).

Special attributes: __name__ is the class name; __module__ is the module name in which the class was defined; __dict__ is the dictionary containing the class's namespace; __bases__ is a tuple containing the base classes, in the order of their occurrence in the base class list; __doc__ is the class's documentation string, or None if undefined; __annotations__ (optional) is a dictionary containing variable annotations collected during class body execution.

Class instances

A class instance is created by calling a class object (see above). A class instance has a namespace implemented as a dictionary which is the first place in which attribute references are searched. When an attribute is not found there, and the instance's class has an attribute by that name, the search continues with the class attributes. If a class attribute is found that is a user-defined function object, it is transformed into an instance method object whose __self__ attribute is the instance. Static method and class method objects are also transformed; see above under "Classes". See section Implementing Descriptors for another way in which attributes of a class retrieved via its instances may differ from the objects actually stored in the class's __dict__ . If no class attribute is found, and the object's class has a __getattr__() method, that is called to satisfy the lookup.
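A minimal sketch of the lookup order, including the __getattr__ fallback (Lazy is an invented class):

```python
class Lazy:
    default = "class attribute"

    def __getattr__(self, name):      # called only when normal lookup fails
        return "computed:" + name

obj = Lazy()
obj.x = 1                             # stored in the instance __dict__
print(obj.x)                          # 1 -- found on the instance first
print(obj.default)                    # found on the class next
print(obj.missing)                    # 'computed:missing' -- __getattr__ fallback
print(obj.__dict__)                   # {'x': 1}
```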

Attribute assignments and deletions update the instance's dictionary, never a class's dictionary. If the class has a __setattr__() or __delattr__() method, this is called instead of updating the instance dictionary directly.
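The asymmetry between lookup (which falls back to the class) and assignment (which never does) can be sketched as follows:

```python
class Counter:
    limit = 10            # class attribute, stored in Counter.__dict__

c = Counter()
assert c.limit == 10      # not in c.__dict__, so lookup falls back to the class

c.limit = 99              # assignment creates an entry in c.__dict__ only
assert Counter.limit == 10          # the class dictionary is untouched
assert c.__dict__ == {'limit': 99}

del c.limit               # deletion removes only the instance entry
assert c.limit == 10      # lookup falls back to the class attribute again
```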

Class instances can pretend to be numbers, sequences, or mappings if they have methods with certain special names. See section Special method names .

Special attributes: __dict__ is the attribute dictionary; __class__ is the instance's class.

I/O objects (also known as file objects)

A file object represents an open file. Various shortcuts are available to create file objects: the open() built-in function, and also os.popen() , os.fdopen() , and the makefile() method of socket objects (and perhaps by other functions or methods provided by extension modules).

The objects sys.stdin , sys.stdout and sys.stderr are initialized to file objects corresponding to the interpreter's standard input, output and error streams; they are all open in text mode and therefore follow the interface defined by the io.TextIOBase abstract class.

Internal types

A few types used internally by the interpreter are exposed to the user. Their definitions may change with future versions of the interpreter, but they are mentioned here for completeness.

Code objects

Code objects represent byte-compiled executable Python code, or bytecode . The difference between a code object and a function object is that the function object contains an explicit reference to the function's globals (the module in which it was defined), while a code object contains no context; also the default argument values are stored in the function object, not in the code object (because they represent values calculated at run-time). Unlike function objects, code objects are immutable and contain no references (directly or indirectly) to mutable objects.

Special read-only attributes: co_name gives the function name; co_argcount is the number of positional arguments (including arguments with default values); co_nlocals is the number of local variables used by the function (including arguments); co_varnames is a tuple containing the names of the local variables (starting with the argument names); co_cellvars is a tuple containing the names of local variables that are referenced by nested functions; co_freevars is a tuple containing the names of free variables; co_code is a string representing the sequence of bytecode instructions; co_consts is a tuple containing the literals used by the bytecode; co_names is a tuple containing the names used by the bytecode; co_filename is the filename from which the code was compiled; co_firstlineno is the first line number of the function; co_lnotab is a string encoding the mapping from bytecode offsets to line numbers (for details see the source code of the interpreter); co_stacksize is the required stack size (including local variables); co_flags is an integer encoding a number of flags for the interpreter.

The following flag bits are defined for co_flags : bit 0x04 is set if the function uses the *arguments syntax to accept an arbitrary number of positional arguments; bit 0x08 is set if the function uses the **keywords syntax to accept arbitrary keyword arguments; bit 0x20 is set if the function is a generator.

Future feature declarations ( from __future__ import division ) also use bits in co_flags to indicate whether a code object was compiled with a particular feature enabled: bit 0x2000 is set if the function was compiled with future division enabled; bits 0x10 and 0x1000 were used in earlier versions of Python.

Other bits in co_flags are reserved for internal use.

If a code object represents a function, the first item in co_consts is the documentation string of the function, or None if undefined.
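Several of the co_* attributes above can be inspected via a function's `__code__` attribute; a small sketch:

```python
def greet(name, punct="!"):
    """Return a greeting."""
    return "Hello, " + name + punct

code = greet.__code__
print(code.co_name)          # 'greet'
print(code.co_argcount)      # 2  (arguments with defaults are counted)
print(code.co_varnames[:2])  # ('name', 'punct') -- argument names come first
print(code.co_consts[0])     # the docstring, per the note above
```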

Frame objects

Frame objects represent execution frames. They may occur in traceback objects (see below).

Special read-only attributes: f_back points to the previous stack frame (towards the caller), or None if this is the bottom stack frame; f_code is the code object being executed in this frame; f_locals is the dictionary used to look up local variables; f_globals is used for global variables; f_builtins is used for built-in (intrinsic) names; f_lasti gives the precise instruction (this is an index into the bytecode string of the code object).

Special writable attributes: f_trace , if not None , is a function called at the start of each source code line (this is used by the debugger); f_lineno is the current line number of the frame -- writing to this from within a trace function jumps to the given line (only for the bottom-most frame). A debugger can implement a Jump command (aka Set Next Statement) by writing to f_lineno.

Frame objects support one method:

frame.clear()
This method clears all references to local variables held by the frame. Also, if the frame belonged to a generator, the generator is finalized. This helps break reference cycles involving frame objects (for example when catching an exception and storing its traceback for later use).

RuntimeError is raised if the frame is currently executing.

New in version 3.4.
Traceback objects

Traceback objects represent a stack trace of an exception. A traceback object is created when an exception occurs. When the search for an exception handler unwinds the execution stack, at each unwound level a traceback object is inserted in front of the current traceback. When an exception handler is entered, the stack trace is made available to the program. (See section The try statement .) It is accessible as the third item of the tuple returned by sys.exc_info() . When the program contains no suitable handler, the stack trace is written (nicely formatted) to the standard error stream; if the interpreter is interactive, it is also made available to the user as sys.last_traceback .

Special read-only attributes: tb_next is the next level in the stack trace (towards the frame where the exception occurred), or None if there is no next level; tb_frame points to the execution frame of the current level; tb_lineno gives the line number where the exception occurred; tb_lasti indicates the precise instruction. The line number and last instruction in the traceback may differ from the line number of its frame object if the exception occurred in a try statement with no matching except clause or with a finally clause.
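The tb_next chain can be walked to recover the call path from the handler down to the frame where the exception occurred; a minimal sketch:

```python
import sys

def fail():
    raise ValueError("boom")

try:
    fail()
except ValueError:
    tb = sys.exc_info()[2]   # the traceback object (third item of the tuple)

# Follow tb_next towards the frame where the exception was raised.
frames = []
while tb is not None:
    frames.append(tb.tb_frame.f_code.co_name)
    tb = tb.tb_next
print(frames)                # the last entry is 'fail', where the error arose
```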

Slice objects

Slice objects are used to represent slices for __getitem__() methods. They are also created by the built-in slice() function.

Special read-only attributes: start is the lower bound; stop is the upper bound; step is the step value; each is None if omitted. These attributes can have any type.

Slice objects support one method:

slice.indices(self, length)
This method takes a single integer argument length and computes information about the slice that the slice object would describe if applied to a sequence of length items. It returns a tuple of three integers; respectively these are the start and stop indices and the step or stride length of the slice. Missing or out-of-bounds indices are handled in a manner consistent with regular slices.
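The clipping behaviour of indices() mirrors regular slicing; a short sketch:

```python
s = slice(None, None, 2)         # equivalent to the subscript [::2]
print(s.indices(10))             # (0, 10, 2) -- omitted bounds filled in

s2 = slice(-3, None)             # equivalent to [-3:]
print(s2.indices(10))            # (7, 10, 1) -- negative start resolved

# Out-of-bounds indices are clipped, just as they are in regular slicing.
print(slice(0, 100).indices(5))  # (0, 5, 1)
```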
Static method objects
Static method objects provide a way of defeating the transformation of function objects to method objects described above. A static method object is a wrapper around any other object, usually a user-defined method object. When a static method object is retrieved from a class or a class instance, the object actually returned is the wrapped object, which is not subject to any further transformation. Static method objects are not themselves callable, although the objects they wrap usually are. Static method objects are created by the built-in staticmethod() constructor.
Class method objects
A class method object, like a static method object, is a wrapper around another object that alters the way in which that object is retrieved from classes and class instances. The behaviour of class method objects upon such retrieval is described above, under "User-defined methods". Class method objects are created by the built-in classmethod() constructor.
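The differing retrieval behaviour of the two wrappers can be sketched with a small class (the class and method names are illustrative):

```python
import math

class Circle:
    def __init__(self, r):
        self.r = r

    @staticmethod
    def deg2rad(deg):
        # Retrieved as the plain wrapped function: no implicit first argument.
        return deg * math.pi / 180

    @classmethod
    def unit(cls):
        # Retrieved bound to the class: cls is Circle (or a subclass).
        return cls(1)

assert abs(Circle.deg2rad(180) - math.pi) < 1e-12
c = Circle.unit()
assert isinstance(c, Circle) and c.r == 1
```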
3.3. Special method names

A class can implement certain operations that are invoked by special syntax (such as arithmetic operations or subscripting and slicing) by defining methods with special names. This is Python's approach to operator overloading , allowing classes to define their own behavior with respect to language operators. For instance, if a class defines a method named __getitem__() , and x is an instance of this class, then x[i] is roughly equivalent to type(x).__getitem__(x, i) . Except where mentioned, attempts to execute an operation raise an exception when no appropriate method is defined (typically AttributeError or TypeError ).

Setting a special method to None indicates that the corresponding operation is not available. For example, if a class sets __iter__() to None , the class is not iterable, so calling iter() on its instances will raise a TypeError (without falling back to __getitem__() ). [2]

When implementing a class that emulates any built-in type, it is important that the emulation only be implemented to the degree that it makes sense for the object being modelled. For example, some sequences may work well with retrieval of individual elements, but extracting a slice may not make sense. (One example of this is the NodeList interface in the W3C's Document Object Model.)

3.3.1. Basic customization
object.__new__(cls[, ...])

Called to create a new instance of class cls . __new__() is a static method (special-cased so you need not declare it as such) that takes the class of which an instance was requested as its first argument. The remaining arguments are those passed to the object constructor expression (the call to the class). The return value of __new__() should be the new object instance (usually an instance of cls ).

Typical implementations create a new instance of the class by invoking the superclass's __new__() method using super().__new__(cls[, ...]) with appropriate arguments and then modifying the newly-created instance as necessary before returning it.

If __new__() returns an instance of cls , then the new instance's __init__() method will be invoked like __init__(self[, ...]) , where self is the new instance and the remaining arguments are the same as were passed to __new__() .

If __new__() does not return an instance of cls , then the new instance's __init__() method will not be invoked.

__new__() is intended mainly to allow subclasses of immutable types (like int, str, or tuple) to customize instance creation. It is also commonly overridden in custom metaclasses in order to customize class creation.
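Because an immutable object's value is fixed once it exists, a subclass must adjust the value in __new__() rather than __init__() ; a minimal sketch (the class name is illustrative):

```python
class UpperStr(str):
    """A str subclass that uppercases its value at creation time."""
    def __new__(cls, value):
        # str is immutable, so the stored value must be chosen here;
        # by the time __init__ runs, the string contents already exist.
        return super().__new__(cls, value.upper())

s = UpperStr("hello")
assert s == "HELLO"
assert isinstance(s, UpperStr) and isinstance(s, str)
```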

object.__init__(self[, ...])

Called after the instance has been created (by __new__() ), but before it is returned to the caller. The arguments are those passed to the class constructor expression. If a base class has an __init__() method, the derived class's __init__() method, if any, must explicitly call it to ensure proper initialization of the base class part of the instance; for example: super().__init__([args...]) .

Because __new__() and __init__() work together in constructing objects ( __new__() to create it, and __init__() to customize it), no non- None value may be returned by __init__() ; doing so will cause a TypeError to be raised at runtime.

object.__del__(self)

Called when the instance is about to be destroyed. This is also called a destructor. If a base class has a __del__() method, the derived class's __del__() method, if any, must explicitly call it to ensure proper deletion of the base class part of the instance. Note that it is possible (though not recommended!) for the __del__() method to postpone destruction of the instance by creating a new reference to it. It may then be called at a later time when this new reference is deleted. It is not guaranteed that __del__() methods are called for objects that still exist when the interpreter exits.

Note

del x doesn't directly call x.__del__() -- the former decrements the reference count for x by one, and the latter is only called when x 's reference count reaches zero. Some common situations that may prevent the reference count of an object from going to zero include: circular references between objects (e.g., a doubly-linked list or a tree data structure with parent and child pointers); a reference to the object on the stack frame of a function that caught an exception (the traceback stored in sys.exc_info()[2] keeps the stack frame alive); or a reference to the object on the stack frame that raised an unhandled exception in interactive mode (the traceback stored in sys.last_traceback keeps the stack frame alive). The first situation can only be remedied by explicitly breaking the cycles; the second can be resolved by freeing the reference to the traceback object when it is no longer useful, and the third can be resolved by storing None in sys.last_traceback . Circular references which are garbage are detected and cleaned up when the cyclic garbage collector is enabled (it's on by default). Refer to the documentation for the gc module for more information about this topic.

Warning

Due to the precarious circumstances under which __del__() methods are invoked, exceptions that occur during their execution are ignored, and a warning is printed to sys.stderr instead. Also, when __del__() is invoked in response to a module being deleted (e.g., when execution of the program is done), other globals referenced by the __del__() method may already have been deleted or in the process of being torn down (e.g. the import machinery shutting down). For this reason, __del__() methods should do the absolute minimum needed to maintain external invariants. Starting with version 1.5, Python guarantees that globals whose name begins with a single underscore are deleted from their module before other globals are deleted; if no other references to such globals exist, this may help in assuring that imported modules are still available at the time when the __del__() method is called.

object.__repr__(self)
Called by the repr() built-in function to compute the "official" string representation of an object. If at all possible, this should look like a valid Python expression that could be used to recreate an object with the same value (given an appropriate environment). If this is not possible, a string of the form <...some useful description...> should be returned. The return value must be a string object. If a class defines __repr__() but not __str__() , then __repr__() is also used when an "informal" string representation of instances of that class is required.

This is typically used for debugging, so it is important that the representation is information-rich and unambiguous.

object.__str__(self)
Called by str(object) and the built-in functions format() and print() to compute the "informal" or nicely printable string representation of an object. The return value must be a string object.

This method differs from object.__repr__() in that there is no expectation that __str__() return a valid Python expression: a more convenient or concise representation can be used.

The default implementation defined by the built-in type object calls object.__repr__() .
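The division of labour between the two representations can be sketched as follows (the class is illustrative):

```python
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __repr__(self):
        # Unambiguous, expression-like: aimed at developers and debugging.
        return "Point({!r}, {!r})".format(self.x, self.y)

    def __str__(self):
        # Concise, "informal": aimed at end users.
        return "({}, {})".format(self.x, self.y)

p = Point(1, 2)
assert repr(p) == "Point(1, 2)"
assert str(p) == "(1, 2)"
assert "{}".format(p) == "(1, 2)"   # format() uses the informal representation
```

If __str__() were omitted, str(p) and print(p) would fall back to __repr__().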

object.__bytes__(self)

Called by bytes to compute a byte-string representation of an object. This should return a bytes object.

object.__format__(self, format_spec)
Called by the format() built-in function, and by extension, evaluation of formatted string literals and the str.format() method, to produce a "formatted" string representation of an object. The format_spec argument is a string that contains a description of the formatting options desired. The interpretation of the format_spec argument is up to the type implementing __format__() , however most classes will either delegate formatting to one of the built-in types, or use a similar formatting option syntax.

See Format Specification Mini-Language for a description of the standard formatting syntax.

The return value must be a string object.

Changed in version 3.4: The __format__ method of object itself raises a TypeError if passed any non-empty string.
object.__lt__(self, other)
object.__le__(self, other)
object.__eq__(self, other)
object.__ne__(self, other)
object.__gt__(self, other)
object.__ge__(self, other)

These are the so-called "rich comparison" methods. The correspondence between operator symbols and method names is as follows: x<y calls x.__lt__(y) , x<=y calls x.__le__(y) , x==y calls x.__eq__(y) , x!=y calls x.__ne__(y) , x>y calls x.__gt__(y) , and x>=y calls x.__ge__(y) .

A rich comparison method may return the singleton NotImplemented if it does not implement the operation for a given pair of arguments. By convention, False and True are returned for a successful comparison. However, these methods can return any value, so if the comparison operator is used in a Boolean context (e.g., in the condition of an if statement), Python will call bool() on the value to determine if the result is true or false.

By default, __ne__() delegates to __eq__() and inverts the result unless it is NotImplemented . There are no other implied relationships among the comparison operators, for example, the truth of (x<y or x==y) does not imply x<=y . To automatically generate ordering operations from a single root operation, see functools.total_ordering() .

See the paragraph on __hash__() for some important notes on creating hashable objects which support custom comparison operations and are usable as dictionary keys.

There are no swapped-argument versions of these methods (to be used when the left argument does not support the operation but the right argument does); rather, __lt__() and __gt__() are each other's reflection, __le__() and __ge__() are each other's reflection, and __eq__() and __ne__() are their own reflection. If the operands are of different types, and right operand's type is a direct or indirect subclass of the left operand's type, the reflected method of the right operand has priority, otherwise the left operand's method has priority. Virtual subclassing is not considered.
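A minimal sketch combining NotImplemented with functools.total_ordering() (the class is illustrative):

```python
from functools import total_ordering

@total_ordering
class Version:
    def __init__(self, major, minor):
        self.key = (major, minor)

    def __eq__(self, other):
        if not isinstance(other, Version):
            return NotImplemented   # let Python try the other operand's method
        return self.key == other.key

    def __lt__(self, other):
        if not isinstance(other, Version):
            return NotImplemented
        return self.key < other.key

assert Version(1, 2) < Version(1, 10)
assert Version(2, 0) >= Version(1, 9)   # __ge__ generated by total_ordering
assert Version(1, 0) != "1.0"           # NotImplemented on both sides: unequal
```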

object.__hash__(self)

Called by built-in function hash() and for operations on members of hashed collections including set , frozenset , and dict . __hash__() should return an integer. The only required property is that objects which compare equal have the same hash value; it is advised to mix together the hash values of the components of the object that also play a part in comparison of objects by packing them into a tuple and hashing the tuple. Example:

def __hash__(self):
    return hash((self.name, self.nick, self.color))

Note

hash() truncates the value returned from an object's custom __hash__() method to the size of a Py_ssize_t . This is typically 8 bytes on 64-bit builds and 4 bytes on 32-bit builds. If an object's __hash__() must interoperate on builds of different bit sizes, be sure to check the width on all supported builds. An easy way to do this is with python -c "import sys; print(sys.hash_info.width)" .

If a class does not define an __eq__() method it should not define a __hash__() operation either; if it defines __eq__() but not __hash__() , its instances will not be usable as items in hashable collections. If a class defines mutable objects and implements an __eq__() method, it should not implement __hash__() , since the implementation of hashable collections requires that a key's hash value is immutable (if the object's hash value changes, it will be in the wrong hash bucket).

User-defined classes have __eq__() and __hash__() methods by default; with them, all objects compare unequal (except with themselves) and x.__hash__() returns an appropriate value such that x == y implies both that x is y and hash(x) == hash(y) .

A class that overrides __eq__() and does not define __hash__() will have its __hash__() implicitly set to None . When the __hash__() method of a class is None , instances of the class will raise an appropriate TypeError when a program attempts to retrieve their hash value, and will also be correctly identified as unhashable when checking isinstance(obj, collections.Hashable) .

If a class that overrides __eq__() needs to retain the implementation of __hash__() from a parent class, the interpreter must be told this explicitly by setting __hash__ = <ParentClass>.__hash__ .

If a class that does not override __eq__() wishes to suppress hash support, it should include __hash__ = None in the class definition. A class which defines its own __hash__() that explicitly raises a TypeError would be incorrectly identified as hashable by an isinstance(obj, collections.Hashable) call.

Note

By default, the __hash__() values of str, bytes and datetime objects are "salted" with an unpredictable random value. Although they remain constant within an individual Python process, they are not predictable between repeated invocations of Python.

This is intended to provide protection against a denial-of-service caused by carefully-chosen inputs that exploit the worst case performance of a dict insertion, O(n^2) complexity. See http://www.ocert.org/advisories/ocert-2011-003.html for details.

Changing hash values affects the iteration order of dicts, sets and other mappings. Python has never made guarantees about this ordering (and it typically varies between 32-bit and 64-bit builds).

See also PYTHONHASHSEED . Changed in version 3.3: Hash randomization is enabled by default.

object.__bool__(self)

Called to implement truth value testing and the built-in operation bool() ; should return False or True . When this method is not defined, __len__() is called, if it is defined, and the object is considered true if its result is nonzero. If a class defines neither __len__() nor __bool__() , all its instances are considered true.
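The fallback chain described above, __bool__() then __len__() then "always true", can be sketched as follows:

```python
class Bag:
    def __init__(self, items):
        self.items = list(items)

    def __len__(self):
        return len(self.items)

# No __bool__ is defined, so truth testing falls back to __len__().
assert not Bag([])         # length 0 -> considered false
assert Bag([1, 2])         # nonzero length -> considered true

class Plain:
    pass

assert Plain()             # neither __len__ nor __bool__ -> always true
```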

3.3.2. Customizing attribute access

The following methods can be defined to customize the meaning of attribute access (use of, assignment to, or deletion of x.name ) for class instances.

object.__getattr__(self, name)
Called when an attribute lookup has not found the attribute in the usual places (i.e. it is not an instance attribute nor is it found in the class tree for self ). name is the attribute name. This method should return the (computed) attribute value or raise an AttributeError exception.

Note that if the attribute is found through the normal mechanism, __getattr__() is not called. (This is an intentional asymmetry between __getattr__() and __setattr__() .) This is done both for efficiency reasons and because otherwise __getattr__() would have no way to access other attributes of the instance. Note that at least for instance variables, you can fake total control by not inserting any values in the instance attribute dictionary (but instead inserting them in another object). See the __getattribute__() method below for a way to actually get total control over attribute access.
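Because __getattr__() runs only after normal lookup fails, it is well suited to delegation; a minimal sketch (the class name is illustrative):

```python
class Proxy:
    def __init__(self, target):
        self.target = target          # stored in the instance dictionary

    def __getattr__(self, name):
        # Called only when normal lookup fails, so 'target' itself
        # (found in self.__dict__) never reaches this method.
        return getattr(self.target, name)

p = Proxy([1, 2, 3])
p.append(4)                # 'append' is not found -> forwarded to the list
assert p.target == [1, 2, 3, 4]
assert p.count(2) == 1     # likewise forwarded
```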

object.__getattribute__(self, name)
Called unconditionally to implement attribute accesses for instances of the class. If the class also defines __getattr__() , the latter will not be called unless __getattribute__() either calls it explicitly or raises an AttributeError . This method should return the (computed) attribute value or raise an AttributeError exception. In order to avoid infinite recursion in this method, its implementation should always call the base class method with the same name to access any attributes it needs, for example, object.__getattribute__(self, name) .

Note

This method may still be bypassed when looking up special methods as the result of implicit invocation via language syntax or built-in functions. See Special method lookup .

object.__setattr__(self, name, value)
Called when an attribute assignment is attempted. This is called instead of the normal mechanism (i.e. store the value in the instance dictionary). name is the attribute name, value is the value to be assigned to it.

If __setattr__() wants to assign to an instance attribute, it should call the base class method with the same name, for example, object.__setattr__(self, name, value) .

object.__delattr__(self, name)
Like __setattr__() but for attribute deletion instead of assignment. This should only be implemented if del obj.name is meaningful for the object.
object.__dir__(self)
Called when dir() is called on the object. A sequence must be returned. dir() converts the returned sequence to a list and sorts it.
3.3.2.1. Implementing Descriptors

The following methods only apply when an instance of the class containing the method (a so-called descriptor class) appears in an owner class (the descriptor must be in either the owner's class dictionary or in the class dictionary for one of its parents). In the examples below, "the attribute" refers to the attribute whose name is the key of the property in the owner class' __dict__ .

object.__get__(self, instance, owner)
Called to get the attribute of the owner class (class attribute access) or of an instance of that class (instance attribute access). owner is always the owner class, while instance is the instance that the attribute was accessed through, or None when the attribute is accessed through the owner . This method should return the (computed) attribute value or raise an AttributeError exception.
object.__set__(self, instance, value)
Called to set the attribute on an instance instance of the owner class to a new value, value .
object.__delete__(self, instance)
Called to delete the attribute on an instance instance of the owner class.
object.__set_name__(self, owner, name)
Called at the time the owning class owner is created. The descriptor has been assigned to name . New in version 3.6.
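A minimal sketch of a data descriptor using these methods, including the 3.6+ __set_name__() hook (class and attribute names are illustrative):

```python
class Positive:
    """A data descriptor that rejects non-positive values."""
    def __set_name__(self, owner, name):
        self.name = name                    # learn the attribute name at class creation

    def __get__(self, instance, owner):
        if instance is None:
            return self                     # accessed on the class itself
        return instance.__dict__[self.name]

    def __set__(self, instance, value):
        if value <= 0:
            raise ValueError(self.name + " must be positive")
        instance.__dict__[self.name] = value

class Order:
    quantity = Positive()   # descriptor lives in the owner class's __dict__

o = Order()
o.quantity = 5
assert o.quantity == 5
try:
    o.quantity = -1
except ValueError as e:
    err = str(e)
assert err == "quantity must be positive"
```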

The attribute __objclass__ is interpreted by the inspect module as specifying the class where this object was defined (setting this appropriately can assist in runtime introspection of dynamic class attributes). For callables, it may indicate that an instance of the given type (or a subclass) is expected or required as the first positional argument (for example, CPython sets this attribute for unbound methods that are implemented in C).

3.3.2.2. Invoking Descriptors

In general, a descriptor is an object attribute with "binding behavior", one whose attribute access has been overridden by methods in the descriptor protocol: __get__() , __set__() , and __delete__() . If any of those methods are defined for an object, it is said to be a descriptor.

The default behavior for attribute access is to get, set, or delete the attribute from an object's dictionary. For instance, a.x has a lookup chain starting with a.__dict__['x'] , then type(a).__dict__['x'] , and continuing through the base classes of type(a) excluding metaclasses.

However, if the looked-up value is an object defining one of the descriptor methods, then Python may override the default behavior and invoke the descriptor method instead. Where this occurs in the precedence chain depends on which descriptor methods were defined and how they were called.

The starting point for descriptor invocation is a binding, a.x . How the arguments are assembled depends on a :

Direct Call
The simplest and least common call is when user code directly invokes a descriptor method: x.__get__(a) .
Instance Binding
If binding to an object instance, a.x is transformed into the call: type(a).__dict__['x'].__get__(a, type(a)) .
Class Binding
If binding to a class, A.x is transformed into the call: A.__dict__['x'].__get__(None, A) .
Super Binding
If a is an instance of super , then the binding super(B, obj).m() searches obj.__class__.__mro__ for the base class A immediately preceding B and then invokes the descriptor with the call: A.__dict__['m'].__get__(obj, obj.__class__) .

For instance bindings, the precedence of descriptor invocation depends on which descriptor methods are defined. A descriptor can define any combination of __get__() , __set__() and __delete__() . If it does not define __get__() , then accessing the attribute will return the descriptor object itself unless there is a value in the object's instance dictionary. If the descriptor defines __set__() and/or __delete__() , it is a data descriptor; if it defines neither, it is a non-data descriptor. Normally, data descriptors define both __get__() and __set__() , while non-data descriptors have just the __get__() method. Data descriptors with __set__() and __get__() defined always override a redefinition in an instance dictionary. In contrast, non-data descriptors can be overridden by instances.

Python methods (including staticmethod() and classmethod() ) are implemented as non-data descriptors. Accordingly, instances can redefine and override methods. This allows individual instances to acquire behaviors that differ from other instances of the same class.

The property() function is implemented as a data descriptor. Accordingly, instances cannot override the behavior of a property.

3.3.2.3. __slots__

By default, instances of classes have a dictionary for attribute storage. This wastes space for objects having very few instance variables. The space consumption can become acute when creating large numbers of instances.

The default can be overridden by defining __slots__ in a class definition. The __slots__ declaration takes a sequence of instance variables and reserves just enough space in each instance to hold a value for each variable. Space is saved because __dict__ is not created for each instance.

object.__slots__
This class variable can be assigned a string, iterable, or sequence of strings with variable names used by instances. __slots__ reserves space for the declared variables and prevents the automatic creation of __dict__ and __weakref__ for each instance.
3.3.2.3.1. Notes on using __slots__
  • When inheriting from a class without __slots__ , the __dict__ attribute of that class will always be accessible, so a __slots__ definition in the subclass is meaningless.
  • Without a __dict__ variable, instances cannot be assigned new variables not listed in the __slots__ definition. Attempts to assign to an unlisted variable name raises AttributeError . If dynamic assignment of new variables is desired, then add '__dict__' to the sequence of strings in the __slots__ declaration.
  • Without a __weakref__ variable for each instance, classes defining __slots__ do not support weak references to its instances. If weak reference support is needed, then add '__weakref__' to the sequence of strings in the __slots__ declaration.
  • __slots__ are implemented at the class level by creating descriptors ( Implementing Descriptors ) for each variable name. As a result, class attributes cannot be used to set default values for instance variables defined by __slots__ ; otherwise, the class attribute would overwrite the descriptor assignment.
  • The action of a __slots__ declaration is limited to the class where it is defined. As a result, subclasses will have a __dict__ unless they also define __slots__ (which must only contain names of any additional slots).
  • If a class defines a slot also defined in a base class, the instance variable defined by the base class slot is inaccessible (except by retrieving its descriptor directly from the base class). This renders the meaning of the program undefined. In the future, a check may be added to prevent this.
  • Nonempty __slots__ does not work for classes derived from "variable-length" built-in types such as int , bytes and tuple .
  • Any non-string iterable may be assigned to __slots__ . Mappings may also be used; however, in the future, special meaning may be assigned to the values corresponding to each key.
  • __class__ assignment works only if both classes have the same __slots__ .
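The space-saving trade-off described above can be sketched as follows (the class name is illustrative):

```python
class PointS:
    __slots__ = ('x', 'y')    # no per-instance __dict__ is created

    def __init__(self, x, y):
        self.x, self.y = x, y

p = PointS(1, 2)
assert p.x == 1
assert not hasattr(p, '__dict__')   # slot storage replaces the dictionary
try:
    p.z = 3                   # not listed in __slots__ -> rejected
except AttributeError:
    rejected = True
assert rejected
```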
3.3.3. Customizing class creation

Whenever a class inherits from another class, __init_subclass__ is called on that class. This way, it is possible to write classes which change the behavior of subclasses. This is closely related to class decorators, but where class decorators only affect the specific class they're applied to, __init_subclass__ solely applies to future subclasses of the class defining the method.

classmethod object.__init_subclass__(cls)
This method is called whenever the containing class is subclassed. cls is then the new subclass. If defined as a normal instance method, this method is implicitly converted to a class method.

Keyword arguments which are given to a new class are passed to the parent class's __init_subclass__ . For compatibility with other classes using __init_subclass__ , one should take out the needed keyword arguments and pass the others over to the base class, as in:

class Philosopher:
    def __init_subclass__(cls, default_name, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.default_name = default_name

class AustralianPhilosopher(Philosopher, default_name="Bruce"):
    pass

The default implementation object.__init_subclass__ does nothing, but raises an error if it is called with any arguments.
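One common application of __init_subclass__ is a subclass registry; a minimal sketch (class names are hypothetical):

```python
class PluginBase:
    registry = []                              # shared across the hierarchy

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)    # cooperate with other bases
        cls.registry.append(cls.__name__)      # record every new subclass

class CSVPlugin(PluginBase):
    pass

class JSONPlugin(PluginBase):
    pass

print(PluginBase.registry)    # ['CSVPlugin', 'JSONPlugin']
```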

Note

The metaclass hint metaclass is consumed by the rest of the type machinery, and is never passed to __init_subclass__ implementations. The actual metaclass (rather than the explicit hint) can be accessed as type(cls) . New in version 3.6.

3.3.3.1. Metaclasses

By default, classes are constructed using type() . The class body is executed in a new namespace and the class name is bound locally to the result of type(name, bases, namespace) .

The class creation process can be customized by passing the metaclass keyword argument in the class definition line, or by inheriting from an existing class that included such an argument. In the following example, both MyClass and MySubclass are instances of Meta :

class Meta(type):
    pass

class MyClass(metaclass=Meta):
    pass

class MySubclass(MyClass):
    pass

Any other keyword arguments that are specified in the class definition are passed through to all metaclass operations described below.

When a class definition is executed, the following steps occur:

  • the appropriate metaclass is determined
  • the class namespace is prepared
  • the class body is executed
  • the class object is created
3.3.3.2. Determining the appropriate metaclass

The appropriate metaclass for a class definition is determined as follows:

  • if no bases and no explicit metaclass are given, then type() is used
  • if an explicit metaclass is given and it is not an instance of type() , then it is used directly as the metaclass
  • if an instance of type() is given as the explicit metaclass, or bases are defined, then the most derived metaclass is used

The most derived metaclass is selected from the explicitly specified metaclass (if any) and the metaclasses (i.e. type(cls) ) of all specified base classes. The most derived metaclass is one which is a subtype of all of these candidate metaclasses. If none of the candidate metaclasses meets that criterion, then the class definition will fail with TypeError .

3.3.3.3. Preparing the class namespace

Once the appropriate metaclass has been identified, then the class namespace is prepared. If the metaclass has a __prepare__ attribute, it is called as namespace = metaclass.__prepare__(name, bases, **kwds) (where the additional keyword arguments, if any, come from the class definition).

If the metaclass has no __prepare__ attribute, then the class namespace is initialised as an empty ordered mapping.

See also

PEP 3115 - Metaclasses in Python 3000
Introduced the __prepare__ namespace hook
3.3.3.4. Executing the class body

The class body is executed (approximately) as exec(body, globals(), namespace) . The key difference from a normal call to exec() is that lexical scoping allows the class body (including any methods) to reference names from the current and outer scopes when the class definition occurs inside a function.
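These scoping rules can be demonstrated with a small sketch (names hypothetical): the class body can read names from an enclosing function, but method bodies cannot see names defined at class scope directly:

```python
def make_class(limit):
    class Checker:
        cap = limit                # class body sees the enclosing function's names

        def over(self, n):
            # 'cap' alone would raise NameError here: method bodies do not
            # see class-scope names; go through self (or Checker.cap) instead
            return n > self.cap
    return Checker

C = make_class(10)
print(C().over(11))   # True
print(C.cap)          # 10
```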

However, even when the class definition occurs inside the function, methods defined inside the class still cannot see names defined at the class scope. Class variables must be accessed through the first parameter of instance or class methods, or through the implicit lexically scoped __class__ reference described in the next section.

3.3.3.5. Creating the class object

Once the class namespace has been populated by executing the class body, the class object is created by calling metaclass(name, bases, namespace, **kwds) (the additional keywords passed here are the same as those passed to __prepare__ ).

This class object is the one that will be referenced by the zero-argument form of super() . __class__ is an implicit closure reference created by the compiler if any methods in a class body refer to either __class__ or super . This allows the zero argument form of super() to correctly identify the class being defined based on lexical scoping, while the class or instance that was used to make the current call is identified based on the first argument passed to the method.

CPython implementation detail: In CPython 3.6 and later, the __class__ cell is passed to the metaclass as a __classcell__ entry in the class namespace. If present, this must be propagated up to the type.__new__ call in order for the class to be initialised correctly. Failing to do so will result in a DeprecationWarning in Python 3.6, and a RuntimeWarning in the future.

When using the default metaclass type , or any metaclass that ultimately calls type.__new__ , the following additional customisation steps are invoked after creating the class object:

  • first, type.__new__ collects all of the descriptors in the class namespace that define a __set_name__() method;
  • second, all of these __set_name__ methods are called with the class being defined and the assigned name of that particular descriptor; and
  • finally, the __init_subclass__() hook is called on the immediate parent of the new class in its method resolution order.

After the class object is created, it is passed to the class decorators included in the class definition (if any) and the resulting object is bound in the local namespace as the defined class.

When a new class is created by type.__new__ , the object provided as the namespace parameter is copied to a new ordered mapping and the original object is discarded. The new copy is wrapped in a read-only proxy, which becomes the __dict__ attribute of the class object.

See also

PEP 3135 - New super
Describes the implicit __class__ closure reference
3.3.3.6. Metaclass example

The potential uses for metaclasses are boundless. Some ideas that have been explored include enum, logging, interface checking, automatic delegation, automatic property creation, proxies, frameworks, and automatic resource locking/synchronization.

Here is an example of a metaclass that uses a collections.OrderedDict to remember the order in which class variables are defined:

import collections

class OrderedClass(type):

    @classmethod
    def __prepare__(metacls, name, bases, **kwds):
        return collections.OrderedDict()

    def __new__(cls, name, bases, namespace, **kwds):
        result = type.__new__(cls, name, bases, dict(namespace))
        result.members = tuple(namespace)
        return result

class A(metaclass=OrderedClass):
    def one(self): pass
    def two(self): pass
    def three(self): pass
    def four(self): pass

>>> A.members
('__module__', 'one', 'two', 'three', 'four')

When the class definition for A gets executed, the process begins with calling the metaclass's __prepare__() method which returns an empty collections.OrderedDict . That mapping records the methods and attributes of A as they are defined within the body of the class statement. Once those definitions are executed, the ordered dictionary is fully populated and the metaclass's __new__() method gets invoked. That method builds the new type and it saves the ordered dictionary keys in an attribute called members .

3.3.4. Customizing instance and subclass checks

The following methods are used to override the default behavior of the isinstance() and issubclass() built-in functions.

In particular, the metaclass abc.ABCMeta implements these methods in order to allow the addition of Abstract Base Classes (ABCs) as "virtual base classes" to any class or type (including built-in types), including other ABCs.

class.__instancecheck__(self, instance)
Return true if instance should be considered a (direct or indirect) instance of class . If defined, called to implement isinstance(instance, class) .
class.__subclasscheck__(self, subclass)
Return true if subclass should be considered a (direct or indirect) subclass of class . If defined, called to implement issubclass(subclass, class) .

Note that these methods are looked up on the type (metaclass) of a class. They cannot be defined as class methods in the actual class. This is consistent with the lookup of special methods that are called on instances, only in this case the instance is itself a class.

See also

PEP 3119 - Introducing Abstract Base Classes
Includes the specification for customizing isinstance() and issubclass() behavior through __instancecheck__() and __subclasscheck__() , with motivation for this functionality in the context of adding Abstract Base Classes (see the abc module) to the language.
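A minimal sketch of overriding isinstance() through a metaclass (the "even number" check is purely illustrative):

```python
class EvenMeta(type):
    def __instancecheck__(cls, instance):
        # defines what isinstance(instance, Even) means
        return isinstance(instance, int) and instance % 2 == 0

class Even(metaclass=EvenMeta):
    pass

print(isinstance(4, Even))    # True
print(isinstance(5, Even))    # False
```

Note that the method lives on the metaclass, not on Even itself, as explained above.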
3.3.5. Emulating callable objects
object.__call__(self[, args...])

Called when the instance is "called" as a function; if this method is defined, x(arg1, arg2, ...) is a shorthand for x.__call__(arg1, arg2, ...) .
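For example, a minimal callable object (names hypothetical):

```python
class Adder:
    def __init__(self, n):
        self.n = n

    def __call__(self, x):
        # add5(3) is shorthand for add5.__call__(3)
        return self.n + x

add5 = Adder(5)
print(add5(3))          # 8
print(callable(add5))   # True
```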

3.3.6. Emulating container types

The following methods can be defined to implement container objects. Containers usually are sequences (such as lists or tuples) or mappings (like dictionaries), but can represent other containers as well. The first set of methods is used either to emulate a sequence or to emulate a mapping; the difference is that for a sequence, the allowable keys should be the integers k for which 0 <= k < N (where N is the length of the sequence), or slice objects, which define a range of items.

It is also recommended that mappings provide the methods keys() , values() , items() , get() , clear() , setdefault() , pop() , popitem() , copy() , and update() behaving similarly to those for Python's standard dictionary objects. The collections.abc module provides a MutableMapping abstract base class to help create those methods from a base set of __getitem__() , __setitem__() , __delitem__() , and keys() .

Mutable sequences should provide methods append() , count() , index() , extend() , insert() , pop() , remove() , reverse() and sort() , like Python standard list objects. Finally, sequence types should implement addition (meaning concatenation) and multiplication (meaning repetition) by defining the methods __add__() , __radd__() , __iadd__() , __mul__() , __rmul__() and __imul__() described below; they should not define other numerical operators.

It is recommended that both mappings and sequences implement the __contains__() method to allow efficient use of the in operator; for mappings, in should search the mapping's keys; for sequences, it should search through the values. It is further recommended that both mappings and sequences implement the __iter__() method to allow efficient iteration through the container; for mappings, __iter__() should be the same as keys() ; for sequences, it should iterate through the values.

object.__len__(self)

Called to implement the built-in function len() . Should return the length of the object, an integer >= 0. Also, an object that doesn't define a __bool__() method and whose __len__() method returns zero is considered to be false in a Boolean context.

CPython implementation detail: In CPython, the length is required to be at most sys.maxsize . If the length is larger than sys.maxsize some features (such as len() ) may raise OverflowError . To prevent raising OverflowError by truth value testing, an object must define a __bool__() method.
object.__length_hint__(self)
Called to implement operator.length_hint() . Should return an estimated length for the object (which may be greater or less than the actual length). The length must be an integer >= 0. This method is purely an optimization and is never required for correctness. New in version 3.4.

Note

Slicing is done exclusively with the following three methods. A call like

a[1:2] = b

is translated to

a[slice(1, 2, None)] = b

and so forth. Missing slice items are always filled in with None .

object.__getitem__(self, key)

Called to implement evaluation of self[key] . For sequence types, the accepted keys should be integers and slice objects. Note that the special interpretation of negative indexes (if the class wishes to emulate a sequence type) is up to the __getitem__() method. If key is of an inappropriate type, TypeError may be raised; if of a value outside the set of indexes for the sequence (after any special interpretation of negative values), IndexError should be raised. For mapping types, if key is missing (not in the container), KeyError should be raised.

Note

for loops expect that an IndexError will be raised for illegal indexes to allow proper detection of the end of the sequence.

object.__missing__(self, key)
Called by dict . __getitem__() to implement self[key] for dict subclasses when key is not in the dictionary.
object.__setitem__(self, key, value)
Called to implement assignment to self[key] . Same note as for __getitem__() . This should only be implemented for mappings if the objects support changes to the values for keys, or if new keys can be added, or for sequences if elements can be replaced. The same exceptions should be raised for improper key values as for the __getitem__() method.
object.__delitem__(self, key)
Called to implement deletion of self[key] . Same note as for __getitem__() . This should only be implemented for mappings if the objects support removal of keys, or for sequences if elements can be removed from the sequence. The same exceptions should be raised for improper key values as for the __getitem__() method.
object.__iter__(self)
This method is called when an iterator is required for a container. This method should return a new iterator object that can iterate over all the objects in the container. For mappings, it should iterate over the keys of the container.

Iterator objects also need to implement this method; they are required to return themselves. For more information on iterator objects, see Iterator Types .

object.__reversed__(self)
Called (if present) by the reversed() built-in to implement reverse iteration. It should return a new iterator object that iterates over all the objects in the container in reverse order.

If the __reversed__() method is not provided, the reversed() built-in will fall back to using the sequence protocol ( __len__() and __getitem__() ). Objects that support the sequence protocol should only provide __reversed__() if they can provide an implementation that is more efficient than the one provided by reversed() .

The membership test operators ( in and not in ) are normally implemented as an iteration through a sequence. However, container objects can supply the following special method with a more efficient implementation, which also does not require the object be a sequence.

object.__contains__(self, item)
Called to implement membership test operators. Should return true if item is in self , false otherwise. For mapping objects, this should consider the keys of the mapping rather than the values or the key-item pairs.

For objects that don't define __contains__() , the membership test first tries iteration via __iter__() , then the old sequence iteration protocol via __getitem__() , see this section in the language reference .
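A minimal sketch contrasting an explicit __contains__() with the __iter__() fallback (class names hypothetical):

```python
class Bag:
    def __init__(self, *items):
        self._items = list(items)

    def __contains__(self, item):
        # used directly by the 'in' operator
        return item in self._items

class IterOnlyBag:
    def __init__(self, *items):
        self._items = list(items)

    def __iter__(self):
        # no __contains__ defined: 'in' falls back to iterating
        return iter(self._items)

print(2 in Bag(1, 2, 3))          # True
print(2 in IterOnlyBag(1, 2, 3))  # True, found by iteration
```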

3.3.7. Emulating numeric types

The following methods can be defined to emulate numeric objects. Methods corresponding to operations that are not supported by the particular kind of number implemented (e.g., bitwise operations for non-integral numbers) should be left undefined.

object.__add__(self, other)
object.__sub__(self, other)
object.__mul__(self, other)
object.__matmul__(self, other)
object.__truediv__(self, other)
object.__floordiv__(self, other)
object.__mod__(self, other)
object.__divmod__(self, other)
object.__pow__(self, other[, modulo])
object.__lshift__(self, other)
object.__rshift__(self, other)
object.__and__(self, other)
object.__xor__(self, other)
object.__or__(self, other)

These methods are called to implement the binary arithmetic operations ( + , - , * , @ , / , // , % , divmod() , pow() , ** , << , >> , & , ^ , | ). For instance, to evaluate the expression x + y , where x is an instance of a class that has an __add__() method, x.__add__(y) is called. The __divmod__() method should be the equivalent to using __floordiv__() and __mod__() ; it should not be related to __truediv__() . Note that __pow__() should be defined to accept an optional third argument if the ternary version of the built-in pow() function is to be supported.

If one of those methods does not support the operation with the supplied arguments, it should return NotImplemented .

object.__radd__(self, other)
object.__rsub__(self, other)
object.__rmul__(self, other)
object.__rmatmul__(self, other)
object.__rtruediv__(self, other)
object.__rfloordiv__(self, other)
object.__rmod__(self, other)
object.__rdivmod__(self, other)
object.__rpow__(self, other)
object.__rlshift__(self, other)
object.__rrshift__(self, other)
object.__rand__(self, other)
object.__rxor__(self, other)
object.__ror__(self, other)

These methods are called to implement the binary arithmetic operations ( + , - , * , @ , / , // , % , divmod() , pow() , ** , << , >> , & , ^ , | ) with reflected (swapped) operands. These functions are only called if the left operand does not support the corresponding operation [3] and the operands are of different types. [4] For instance, to evaluate the expression x - y , where y is an instance of a class that has an __rsub__() method, y.__rsub__(x) is called if x.__sub__(y) returns NotImplemented .

Note that ternary pow() will not try calling __rpow__() (the coercion rules would become too complicated).

Note

If the right operand's type is a subclass of the left operand's type and that subclass provides the reflected method for the operation, this method will be called before the left operand's non-reflected method. This behavior allows subclasses to override their ancestors' operations.
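The NotImplemented protocol and a reflected method can be sketched as follows (the Meters class is hypothetical):

```python
class Meters:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Meters):
            return Meters(self.value + other.value)
        return NotImplemented          # let the right operand try

    def __radd__(self, other):
        # called for other + self after other.__add__ returns NotImplemented
        if isinstance(other, (int, float)):
            return Meters(other + self.value)
        return NotImplemented

print((Meters(1) + Meters(2)).value)   # 3, via __add__
print((2 + Meters(3)).value)           # 5, via __radd__
```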

object.__iadd__(self, other)
object.__isub__(self, other)
object.__imul__(self, other)
object.__imatmul__(self, other)
object.__itruediv__(self, other)
object.__ifloordiv__(self, other)
object.__imod__(self, other)
object.__ipow__(self, other[, modulo])
object.__ilshift__(self, other)
object.__irshift__(self, other)
object.__iand__(self, other)
object.__ixor__(self, other)
object.__ior__(self, other)
These methods are called to implement the augmented arithmetic assignments ( += , -= , *= , @= , /= , //= , %= , **= , <<= , >>= , &= , ^= , |= ). These methods should attempt to do the operation in-place (modifying self ) and return the result (which could be, but does not have to be, self ). If a specific method is not defined, the augmented assignment falls back to the normal methods. For instance, if x is an instance of a class with an __iadd__() method, x += y is equivalent to x = x.__iadd__(y) . Otherwise, x.__add__(y) and y.__radd__(x) are considered, as with the evaluation of x + y . In certain situations, augmented assignment can result in unexpected errors (see Why does a_tuple[i] += ['item'] raise an exception when the addition works? ), but this behavior is in fact part of the data model.
object.__neg__(self)
object.__pos__(self)
object.__abs__(self)
object.__invert__(self)

Called to implement the unary arithmetic operations ( - , + , abs() and ~ ).

object.__complex__(self)
object.__int__(self)
object.__float__(self)
object.__round__(self[, n])

Called to implement the built-in functions complex() , int() , float() and round() . Should return a value of the appropriate type.

object.__index__(self)
Called to implement operator.index() , and whenever Python needs to losslessly convert the numeric object to an integer object (such as in slicing, or in the built-in bin() , hex() and oct() functions). Presence of this method indicates that the numeric object is an integer type. Must return an integer.

Note

In order to have a coherent integer type class, when __index__() is defined __int__() should also be defined, and both should return the same value.
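A minimal integer-like type (the Nibble class is hypothetical) illustrating __index__() alongside a matching __int__():

```python
class Nibble:
    def __init__(self, value):
        self.value = value & 0xF    # keep only the low 4 bits

    def __index__(self):
        return self.value           # lossless conversion used by hex(), slicing, ...

    def __int__(self):
        return self.value           # keep __int__ consistent with __index__

n = Nibble(5)
print(hex(n))                        # 0x5
print([10, 20, 30, 40, 50, 60][n])   # 60: list indexing calls __index__
```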

3.3.8. With Statement Context Managers

A context manager is an object that defines the runtime context to be established when executing a with statement. The context manager handles the entry into, and the exit from, the desired runtime context for the execution of the block of code. Context managers are normally invoked using the with statement (described in section The with statement ), but can also be used by directly invoking their methods.

Typical uses of context managers include saving and restoring various kinds of global state, locking and unlocking resources, closing opened files, etc.

For more information on context managers, see Context Manager Types .

object.__enter__(self)
Enter the runtime context related to this object. The with statement will bind this method's return value to the target(s) specified in the as clause of the statement, if any.
object.__exit__(self, exc_type, exc_value, traceback)
Exit the runtime context related to this object. The parameters describe the exception that caused the context to be exited. If the context was exited without an exception, all three arguments will be None .

If an exception is supplied, and the method wishes to suppress the exception (i.e., prevent it from being propagated), it should return a true value. Otherwise, the exception will be processed normally upon exit from this method.

Note that __exit__() methods should not reraise the passed-in exception; this is the caller's responsibility.
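A minimal context manager that suppresses a chosen exception type, similar in spirit to contextlib.suppress (a sketch, not a drop-in replacement):

```python
class Suppress:
    def __init__(self, exc_type):
        self.exc_type = exc_type

    def __enter__(self):
        return self                 # bound to the 'as' target, if any

    def __exit__(self, exc_type, exc_value, traceback):
        # a true return value tells the interpreter to swallow the exception
        return exc_type is not None and issubclass(exc_type, self.exc_type)

with Suppress(ZeroDivisionError):
    1 / 0                           # raised, then suppressed by __exit__
print("execution continues")
```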

See also

PEP 343 - The "with" statement
The specification, background, and examples for the Python with statement.
3.3.9. Special method lookup

For custom classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object's type, not in the object's instance dictionary. That behaviour is the reason why the following code raises an exception:

>>>
>>> class C:
...     pass
...
>>> c = C()
>>> c.__len__ = lambda: 5
>>> len(c)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: object of type 'C' has no len()

The rationale behind this behaviour lies with a number of special methods such as __hash__() and __repr__() that are implemented by all objects, including type objects. If the implicit lookup of these methods used the conventional lookup process, they would fail when invoked on the type object itself:

>>>
>>> 1 .__hash__() == hash(1)
True
>>> int.__hash__() == hash(int)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: descriptor '__hash__' of 'int' object needs an argument

Incorrectly attempting to invoke an unbound method of a class in this way is sometimes referred to as 'metaclass confusion', and is avoided by bypassing the instance when looking up special methods:

>>>
>>> type(1).__hash__(1) == hash(1)
True
>>> type(int).__hash__(int) == hash(int)
True

In addition to bypassing any instance attributes in the interest of correctness, implicit special method lookup generally also bypasses the __getattribute__() method even of the object's metaclass:

>>>
>>> class Meta(type):
...     def __getattribute__(*args):
...         print("Metaclass getattribute invoked")
...         return type.__getattribute__(*args)
...
>>> class C(object, metaclass=Meta):
...     def __len__(self):
...         return 10
...     def __getattribute__(*args):
...         print("Class getattribute invoked")
...         return object.__getattribute__(*args)
...
>>> c = C()
>>> c.__len__()                 # Explicit lookup via instance
Class getattribute invoked
10
>>> type(c).__len__(c)          # Explicit lookup via type
Metaclass getattribute invoked
10
>>> len(c)                      # Implicit lookup
10

Bypassing the __getattribute__() machinery in this fashion provides significant scope for speed optimisations within the interpreter, at the cost of some flexibility in the handling of special methods (the special method must be set on the class object itself in order to be consistently invoked by the interpreter).

3.4. Coroutines

3.4.1. Awaitable Objects

An awaitable object generally implements an __await__() method. Coroutine objects returned from async def functions are awaitable.

Note

The generator iterator objects returned from generators decorated with types.coroutine() or asyncio.coroutine() are also awaitable, but they do not implement __await__() .

object.__await__(self)
Must return an iterator . Should be used to implement awaitable objects. For instance, asyncio.Future implements this method to be compatible with the await expression.
New in version 3.5.

See also

PEP 492 for additional information about awaitable objects.

3.4.2. Coroutine Objects

Coroutine objects are awaitable objects. A coroutine's execution can be controlled by calling __await__() and iterating over the result. When the coroutine has finished executing and returns, the iterator raises StopIteration , and the exception's value attribute holds the return value. If the coroutine raises an exception, it is propagated by the iterator. Coroutines should not directly raise unhandled StopIteration exceptions.

Coroutines also have the methods listed below, which are analogous to those of generators (see Generator-iterator methods ). However, unlike generators, coroutines do not directly support iteration.

Changed in version 3.5.2: It is a RuntimeError to await on a coroutine more than once.
coroutine.send(value)
Starts or resumes execution of the coroutine. If value is None , this is equivalent to advancing the iterator returned by __await__() . If value is not None , this method delegates to the send() method of the iterator that caused the coroutine to suspend. The result (return value, StopIteration , or other exception) is the same as when iterating over the __await__() return value, described above.
coroutine.throw(type[, value[, traceback]])
Raises the specified exception in the coroutine. This method delegates to the throw() method of the iterator that caused the coroutine to suspend, if it has such a method. Otherwise, the exception is raised at the suspension point. The result (return value, StopIteration , or other exception) is the same as when iterating over the __await__() return value, described above. If the exception is not caught in the coroutine, it propagates back to the caller.
coroutine.close()
Causes the coroutine to clean itself up and exit. If the coroutine is suspended, this method first delegates to the close() method of the iterator that caused the coroutine to suspend, if it has such a method. Then it raises GeneratorExit at the suspension point, causing the coroutine to immediately clean itself up. Finally, the coroutine is marked as having finished executing, even if it was never started.

Coroutine objects are automatically closed using the above process when they are about to be destroyed.
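Driving a coroutine by hand shows these methods in action; the Suspend awaitable below is a hypothetical helper whose __await__() yields once:

```python
class Suspend:
    def __init__(self, marker):
        self.marker = marker

    def __await__(self):
        # a generator: yield to whoever is driving the coroutine,
        # and return whatever value send() passes back in
        received = yield self.marker
        return received

async def double():
    got = await Suspend('ready')
    return got * 2

c = double()
print(c.send(None))        # 'ready': the coroutine is now suspended
try:
    c.send(21)             # resume with a value; the coroutine returns
except StopIteration as exc:
    print(exc.value)       # 42, the coroutine's return value
```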

3.4.3. Asynchronous Iterators

An asynchronous iterable is able to call asynchronous code in its __aiter__ implementation, and an asynchronous iterator can call asynchronous code in its __anext__ method.

Asynchronous iterators can be used in an async for statement.

object.__aiter__(self)
Must return an asynchronous iterator object.
object.__anext__(self)
Must return an awaitable resulting in a next value of the iterator. Should raise a StopAsyncIteration error when the iteration is over.

An example of an asynchronous iterable object:

class Reader:
    async def readline(self):
        ...

    def __aiter__(self):
        return self

    async def __anext__(self):
        val = await self.readline()
        if val == b'':
            raise StopAsyncIteration
        return val
New in version 3.5.

Note

Changed in version 3.5.2: Starting with CPython 3.5.2, __aiter__ can directly return asynchronous iterators . Returning an awaitable object will result in a PendingDeprecationWarning .

The recommended way of writing backwards compatible code in CPython 3.5.x is to continue returning awaitables from __aiter__ . If you want to avoid the PendingDeprecationWarning and keep the code backwards compatible, the following decorator can be used:

import functools
import sys

if sys.version_info < (3, 5, 2):
    def aiter_compat(func):
        @functools.wraps(func)
        async def wrapper(self):
            return func(self)
        return wrapper
else:
    def aiter_compat(func):
        return func

Example:

class AsyncIterator:

    @aiter_compat
    def __aiter__(self):
        return self

    async def __anext__(self):
        ...

Starting with CPython 3.6, the PendingDeprecationWarning will be replaced with the DeprecationWarning . In CPython 3.7, returning an awaitable from __aiter__ will result in a RuntimeError .

3.4.4. Asynchronous Context Managers

An asynchronous context manager is a context manager that is able to suspend execution in its __aenter__ and __aexit__ methods.

Asynchronous context managers can be used in an async with statement.

object.__aenter__(self)
This method is semantically similar to __enter__() , the only difference being that it must return an awaitable .
object.__aexit__(self, exc_type, exc_value, traceback)
This method is semantically similar to __exit__() , the only difference being that it must return an awaitable .

An example of an asynchronous context manager class:

class AsyncContextManager:
    async def __aenter__(self):
        await log('entering context')

    async def __aexit__(self, exc_type, exc, tb):
        await log('exiting context')
New in version 3.5.

Footnotes

[1] It is possible in some cases to change an object's type, under certain controlled conditions. It generally isn't a good idea though, since it can lead to some very strange behaviour if it is handled incorrectly.
[2] The __hash__() , __iter__() , __reversed__() , and __contains__() methods have special handling for this; others will still raise a TypeError , but may do so by relying on the behavior that None is not callable.
[3] "Does not support" here means that the class has no such method, or the method returns NotImplemented . Do not set the method to None if you want to force fallback to the right operand's reflected method -- that will instead have the opposite effect of explicitly blocking such fallback.
[4] For operands of the same type, it is assumed that if the non-reflected method (such as __add__() ) fails the operation is not supported, which is why the reflected method is not called.

[Dec 07, 2017] Variable's memory size in Python - Stack Overflow

Dec 07, 2017 | stackoverflow.com

casevh, Jan 17, 2013 at 5:03

Regarding the internal structure of a Python long, check sys.int_info (or sys.long_info for Python 2.7).
>>> import sys
>>> sys.int_info
sys.int_info(bits_per_digit=30, sizeof_digit=4)

Python either stores 30 bits into 4 bytes (most 64-bit systems) or 15 bits into 2 bytes (most 32-bit systems). Comparing the actual memory usage with calculated values, I get

>>> import math, sys
>>> a=0
>>> sys.getsizeof(a)
24
>>> a=2**100
>>> sys.getsizeof(a)
40
>>> a=2**1000
>>> sys.getsizeof(a)
160
>>> 24+4*math.ceil(100/30)
40
>>> 24+4*math.ceil(1000/30)
160

There are 24 bytes of overhead for 0 since no bits are stored. The memory requirements for larger values match the calculated values.

If your numbers are so large that you are concerned about the 6.25% unused bits, you should probably look at the gmpy2 library. The internal representation uses all available bits and computations are significantly faster for large values (say, greater than 100 digits).

[Dec 07, 2017] Variables and scope -- Object-Oriented Programming in Python 1 documentation

Notable quotes:
"... class attributes ..."
"... instance attributes ..."
"... alter the existing value ..."
"... implicit conversion ..."
Dec 07, 2017 | python-textbok.readthedocs.io

Variables and scope

Variables

Recall that a variable is a label for a location in memory. It can be used to hold a value. In statically typed languages, variables have predetermined types, and a variable can only be used to hold values of that type. In Python, we may reuse the same variable to store values of any type.

A variable is similar to the memory functionality found in most calculators, in that it holds one value which can be retrieved many times, and that storing a new value erases the old. A variable differs from a calculator's memory in that one can have many variables storing different values, and that each variable is referred to by name.

Defining variables

To define a new variable in Python, we simply assign a value to a label. For example, this is how we create a variable called count , which contains an integer value of zero:

count = 0

This is exactly the same syntax as assigning a new value to an existing variable called count . Later in this chapter we will discuss under what circumstances this statement will cause a new variable to be created.

If we try to access the value of a variable which hasn't been defined anywhere yet, the interpreter will exit with a name error.

We can define several variables in one line, but this is usually considered bad style:

# Define three variables at once:
count, result, total = 0, 0, 0

# This is equivalent to:
count = 0
result = 0
total = 0

In keeping with good programming style, we should make use of meaningful names for variables.

Variable scope and lifetime

Not all variables are accessible from all parts of our program, and not all variables exist for the same amount of time. Where a variable is accessible and how long it exists depend on how it is defined. We call the part of a program where a variable is accessible its scope , and the duration for which the variable exists its lifetime .

A variable which is defined in the main body of a file is called a global variable. It will be visible throughout the file, and also inside any file which imports that file. Global variables can have unintended consequences because of their wide-ranging effects – that is why we should almost never use them. Only objects which are intended to be used globally, like functions and classes, should be put in the global namespace.

A variable which is defined inside a function is local to that function. It is accessible from the point at which it is defined until the end of the function, and exists for as long as the function is executing. The parameter names in the function definition behave like local variables, but they contain the values that we pass into the function when we call it. When we use the assignment operator ( = ) inside a function, its default behaviour is to create a new local variable – unless a variable with the same name is already defined in the local scope.

Here is an example of variables in different scopes:

# This is a global variable
a = 0

if a == 0:
    # This is still a global variable
    b = 1

def my_function(c):
    # this is a local variable
    d = 3
    print(c)
    print(d)

# Now we call the function, passing the value 7 as the first and only parameter
my_function(7)

# a and b still exist
print(a)
print(b)

# c and d don't exist anymore -- these statements will give us name errors!
print(c)
print(d)

Note

The inside of a class body is also a new local variable scope. Variables which are defined in the class body (but outside any class method) are called class attributes . They can be referenced by their bare names within the same scope, but they can also be accessed from outside this scope if we use the attribute access operator ( . ) on a class or an instance (an object which uses that class as its type). An attribute can also be set explicitly on an instance or class from inside a method. Attributes set on instances are called instance attributes . Class attributes are shared between all instances of a class, but each instance has its own separate instance attributes. We will look at this in greater detail in the chapter about classes.

The assignment operator

As we saw in the previous sections, the assignment operator in Python is a single equals sign ( = ). This operator assigns the value on the right hand side to the variable on the left hand side, sometimes creating the variable first. If the right hand side is an expression (such as an arithmetic expression), it will be evaluated before the assignment occurs. Here are a few examples:

a_number = 5              # a_number becomes 5
a_number = total          # a_number becomes the value of total
a_number = total + 5      # a_number becomes the value of total + 5
a_number = a_number + 1   # a_number becomes the value of a_number + 1

The last statement might look a bit strange if we were to interpret = as a mathematical equals sign – clearly a number cannot be equal to the same number plus one! Remember that = is an assignment operator – this statement is assigning a new value to the variable a_number which is equal to the old value of a_number plus one.

Assigning an initial value to a variable is called initialising the variable. In some languages defining a variable can be done in a separate step before the first value assignment. It is thus possible in those languages for a variable to be defined but not have a value – which could lead to errors or unexpected behaviour if we try to use the value before it has been assigned. In Python a variable is defined and assigned a value in a single step, so we will almost never encounter situations like this.
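This is easy to demonstrate (a minimal sketch; undefined_total is just an illustrative name):

```python
# Accessing a name before any assignment raises a NameError:
try:
    print(undefined_total)   # not assigned anywhere yet
except NameError as error:
    print("Error:", error)

# After the first assignment, the name is defined and usable:
undefined_total = 0
print(undefined_total)
```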

The left hand side of the assignment statement must be a valid target:

# this is fine:
a = 3

# these are all illegal:
3 = 4
3 = a
a + b = 3

An assignment statement may have multiple targets separated by equals signs. The expression on the right hand side of the last equals sign will be assigned to all the targets. All the targets must be valid:

# both a and b will be set to zero:
a = b = 0

# this is illegal, because we can't set 0 to b:
a = 0 = b
Compound assignment operators

We have already seen that we can assign the result of an arithmetic expression to a variable:

total = a + b + c + 50

Counting is something that is done often in a program. For example, we might want to keep count of how many times a certain event occurs by using a variable called count . We would initialise this variable to zero and add one to it every time the event occurs. We would perform the addition with this statement:

count = count + 1

This is in fact a very common operation. Python has a shorthand operator, += , which lets us express it more cleanly, without having to write the name of the variable twice:

# These statements mean exactly the same thing:
count = count + 1
count += 1

# We can increment a variable by any number we like.
count += 2
count += 7
count += a + b

There is a similar operator, -= , which lets us decrement numbers:

# These statements mean exactly the same thing:
count = count - 3
count -= 3

Other common compound assignment operators are given in the table below:

Operator   Example    Equivalent to
+=         x += 5     x = x + 5
-=         x -= 5     x = x - 5
*=         x *= 5     x = x * 5
/=         x /= 5     x = x / 5
%=         x %= 5     x = x % 5
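Each compound operator behaves exactly like its longhand form; for example:

```python
x = 10
x += 5   # same as x = x + 5  -> 15
x -= 3   # same as x = x - 3  -> 12
x *= 2   # same as x = x * 2  -> 24
x /= 4   # same as x = x / 4  -> 6.0 (/ always produces a float in Python 3)
x %= 4   # same as x = x % 4  -> 2.0
print(x)
```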
More about scope: crossing boundaries

What if we want to access a global variable from inside a function? It is possible, but doing so comes with a few caveats:

a = 0

def my_function():
    print(a)

my_function()

The print statement will output 0 , the value of the global variable a , as you probably expected. But what about this program?

a = 0

def my_function():
    a = 3
    print(a)

my_function()

print(a)

When we call the function, the print statement inside outputs 3 – but why does the print statement at the end of the program output 0 ?

By default, the assignment statement creates variables in the local scope. So the assignment inside the function does not modify the global variable a – it creates a new local variable called a , and assigns the value 3 to that variable. The first print statement outputs the value of the new local variable – because if a local variable has the same name as a global variable the local variable will always take precedence. The last print statement prints out the global variable, which has remained unchanged.

What if we really want to modify a global variable from inside a function? We can use the global keyword:

a = 0

def my_function():
    global a
    a = 3
    print(a)

my_function()

print(a)

We may not refer to both a global variable and a local variable by the same name inside the same function. This program will give us an error:

a = 0

def my_function():
    print(a)
    a = 3
    print(a)

my_function()

Because we haven't declared a to be global, the assignment in the second line of the function will create a local variable a . This means that we can't refer to the global variable a elsewhere in the function, even before this line! The first print statement now refers to the local variable a – but this variable doesn't have a value in the first line, because we haven't assigned it yet!

Note that it is usually very bad practice to access global variables from inside functions, and even worse practice to modify them. This makes it difficult to arrange our program into logically encapsulated parts which do not affect each other in unexpected ways. If a function needs to access some external value, we should pass the value into the function as a parameter. If the function is a method of an object, it is sometimes appropriate to make the value an attribute of the same object – we will discuss this in the chapter about object orientation.

Note

There is also a nonlocal keyword in Python – when we nest a function inside another function, it allows us to modify a variable in the outer function from inside the inner function (or, if the function is nested multiple times, a variable in one of the outer functions). If we use the global keyword, the assignment statement will create the variable in the global scope if it does not exist already. If we use the nonlocal keyword, however, the variable must be defined, because it is impossible for Python to determine in which scope it should be created.

Exercise 1

  1. Describe the scope of the variables a , b , c and d in this example:

    def my_function(a):
        b = a - 2
        return b
    
    c = 3
    
    if c > 2:
        d = my_function(5)
        print(d)
    
  2. What is the lifetime of these variables? When will they be created and destroyed?

  3. Can you guess what would happen if we were to assign c a value of 1 instead?

  4. Why would this be a problem? Can you think of a way to avoid it?

Modifying values

Constants

In some languages, it is possible to define special variables which can be assigned a value only once – once their values have been set, they cannot be changed. We call these kinds of variables constants . Python does not allow us to set such a restriction on variables, but there is a widely used convention for marking certain variables to indicate that their values are not meant to change: we write their names in all caps, with underscores separating words:

# These variables are "constants" by convention:
NUMBER_OF_DAYS_IN_A_WEEK = 7
NUMBER_OF_MONTHS_IN_A_YEAR = 12

# Nothing is actually stopping us from redefining them...
NUMBER_OF_DAYS_IN_A_WEEK = 8

# ...but it's probably not a good idea.

Why do we bother defining variables that we don't intend to change? Consider this example:

MAXIMUM_MARK = 80

tom_mark = 58
print("Tom's mark is %.2f%%" % (tom_mark / MAXIMUM_MARK * 100))
# %% is how we escape a literal % inside a string

There are several good reasons to define MAXIMUM_MARK instead of just writing 80 inside the print statement. First, this gives the number a descriptive label which explains what it is – this makes the code more understandable. Second, we may eventually need to refer to this number in our program more than once. If we ever need to update our code with a new value for the maximum mark, we will only have to change it in one place, instead of finding every place where it is used – such replacements are often error-prone.

Literal numbers scattered throughout a program are known as "magic numbers" – using them is considered poor coding style. This does not apply to small numbers which are considered self-explanatory – it's easy to understand why a total is initialised to zero or incremented by one.

Sometimes we want to use a variable to distinguish between several discrete options. It is useful to refer to the option values using constants instead of using them directly if the values themselves have no intrinsic meaning:

# We define some options
LOWER, UPPER, CAPITAL = 1, 2, 3

name = "jane"
# We use our constants when assigning these values...
print_style = UPPER

# ...and when checking them:
if print_style == LOWER:
    print(name.lower())
elif print_style == UPPER:
    print(name.upper())
elif print_style == CAPITAL:
    print(name.capitalize())
else:
    # Nothing prevents us from accidentally setting print_style to 4, 90 or
    # "spoon", so we put in this fallback just in case:
    print("Unknown style option!")

In the above example, the values 1 , 2 and 3 are not important – they are completely meaningless. We could equally well use three other distinct numbers, or the strings 'lower' , 'upper' and 'capital' . The only important thing is that the three values must be different. If we used the numbers directly instead of the constants the program would be much more confusing to read. Using meaningful strings would make the code more readable, but we could accidentally make a spelling mistake while setting one of the values and not notice – if we mistype the name of one of the constants we are more likely to get an error straight away.

Some Python libraries define common constants for our convenience, for example:

# we need to import these libraries before we use them
import string
import math
import re

# All the lowercase ASCII letters: 'abcdefghijklmnopqrstuvwxyz'
print(string.ascii_lowercase)

# The mathematical constants pi and e, both floating-point numbers
print(math.pi) # ratio of circumference of a circle to its diameter
print(math.e) # natural base of logarithms

# This integer is an option which we can pass to functions in the re
# (regular expression) library.
print(re.IGNORECASE)

Note that many built-in constants don't follow the all-caps naming convention.

Mutable and immutable types

Some values in Python can be modified, and some cannot. This does not ever mean that we can't change the value of a variable – but if a variable contains a value of an immutable type , we can only assign it a new value . We cannot alter the existing value in any way.

Integers, floating-point numbers and strings are all immutable types – in all the previous examples, when we changed the values of existing variables we used the assignment operator to assign them new values:

a = 3
a = 2

b = "jane"
b = "bob"

Even the += operator doesn't modify the value of total in-place – it also assigns a new value:

total += 4

We haven't encountered any mutable types yet, but we will use them extensively in later chapters. Lists and dictionaries are mutable, and so are most objects that we are likely to write ourselves:

# this is a list of numbers
my_list = [1, 2, 3]
my_list[0] = 5 # we can change just the first element of the list
print(my_list)

class MyClass(object):
    pass # this is a very silly class

# Now we make a very simple object using our class as a type
my_object = MyClass()

# We can change the values of attributes on the object
my_object.some_property = 42
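The difference between rebinding an immutable value and modifying a mutable one can be seen with the built-in id() function, which returns an object's identity (a small sketch):

```python
total = 10
before = id(total)
total += 4                   # ints are immutable: total is rebound to a new object
print(id(total) != before)   # True -- a different object now

my_list = [1, 2]
alias = my_list
my_list += [3]               # lists are mutable: the object is extended in place
print(alias)                 # [1, 2, 3] -- alias sees the change, same object
```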
More about input

In the earlier sections of this unit we learned how to make a program display a message using the print function or read a string value from the user using the input function. What if we want the user to input numbers or other types of variables? We still use the input function, but we must convert the string values returned by input to the types that we want. Here is a simple example:

height = int(input("Enter height of rectangle: "))
width = int(input("Enter width of rectangle: "))

print("The area of the rectangle is %d" % (width * height))

int is a function which converts values of various types to ints. We will discuss type conversion in greater detail in the next section, but for now it is important to know that int will not be able to convert a string to an integer if it contains anything except digits. The program above will exit with an error if the user enters "aaa" , "zzz10" or even "7.5" . When we write a program which relies on user input, which can be incorrect, we need to add some safeguards so that we can recover if the user makes a mistake. For example, we can detect if the user entered bad input and exit with a nicer error message:

try:
    height = int(input("Enter height of rectangle: "))
    width = int(input("Enter width of rectangle: "))
except ValueError as e: # if a value error occurs, we will skip to this point
    print("Error reading height and width: %s" % e)

This program will still only attempt to read in the input once, and exit if it is incorrect. If we want to keep asking the user for input until it is correct, we can do something like this:

correct_input = False # this is a boolean value -- it can be either true or false.

while not correct_input: # this is a while loop
    try:
        height = int(input("Enter height of rectangle: "))
        width = int(input("Enter width of rectangle: "))
    except ValueError:
        print("Please enter valid integers for the height and width.")
    else: # this will be executed if there is no value error
        correct_input = True

We will learn more about boolean values, loops and exceptions later.

Example: calculating petrol consumption of a car

In this example, we will write a simple program which asks the user for the distance travelled by a car, and the monetary value of the petrol that was used to cover that distance. From this information, together with the price per litre of petrol, the program will calculate the efficiency of the car, both in litres per 100 kilometres and in kilometres per litre.

First we will define the petrol price as a constant at the top. This will make it easy for us to update the price when it changes on the first Wednesday of every month:

PETROL_PRICE_PER_LITRE = 4.50

When the program starts, we want to print out a welcome message:

print("*** Welcome to the fuel efficiency calculator! ***\n")
# we add an extra blank line after the message with \n

Ask the user for his or her name:

name = input("Enter your name: ")

Ask the user for the distance travelled:

# float is a function which converts values to floating-point numbers.
distance_travelled = float(input("Enter distance travelled in km: "))

Then ask the user for the amount paid:

amount_paid = float(input("Enter monetary value of fuel bought for the trip: R"))

Now we will do the calculations:

fuel_consumed = amount_paid / PETROL_PRICE_PER_LITRE

efficiency_l_per_100_km = fuel_consumed / distance_travelled * 100
efficiency_km_per_l = distance_travelled / fuel_consumed

Finally, we output the results:

print("Hi, %s!" % name)
print("Your car's efficiency is %.2f litres per 100 km." % efficiency_l_per_100_km)
print("This means that you can travel %.2f km on a litre of petrol." % efficiency_km_per_l)

# we add an extra blank line before the message with \n
print("\nThanks for using the program.")
Exercise 2
  1. Write a Python program to convert a temperature given in degrees Fahrenheit to its equivalent in degrees Celsius. You can assume that T_c = (5/9) x (T_f - 32) , where T_c is the temperature in °C and T_f is the temperature in °F. Your program should ask the user for an input value, and print the output. The input and output values should be floating-point numbers.
  2. What could make this program crash? What would we need to do to handle this situation more gracefully?
Type conversion

As we write more programs, we will often find that we need to convert data from one type to another, for example from a string to an integer or from an integer to a floating-point number. There are two kinds of type conversions in Python: implicit and explicit conversions.

Implicit conversion

Recall from the section about floating-point operators that we can arbitrarily combine integers and floating-point numbers in an arithmetic expression – and that the result of any such expression will always be a floating-point number. This is because Python will convert the integers to floating-point numbers before evaluating the expression. This is an implicit conversion – we don't have to convert anything ourselves. There is usually no loss of precision when an integer is converted to a floating-point number.

For example, the integer 2 will automatically be converted to a floating-point number in the following example:

result = 8.5 * 2

8.5 is a float while 2 is an int . Python will automatically convert operands so that they are of the same type. In this case this is achieved if the integer 2 is converted to the floating-point equivalent 2.0 . Then the two floating-point numbers can be multiplied.

Let's have a look at a more complex example:

result = 8.5 + 7 // 3 - 2.5
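Evaluating the pieces separately shows what happens (standard operator precedence):

```python
print(7 // 3)               # 2    -- integer division is processed first
print(8.5 + 2)              # 10.5 -- 2 is implicitly converted to 2.0
print(10.5 - 2.5)           # 8.0
print(8.5 + 7 // 3 - 2.5)   # 8.0  -- the whole expression
```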

Python performs operations according to the order of precedence, and decides whether a conversion is needed on a per-operation basis. In our example // has the highest precedence, so it will be processed first. 7 and 3 are both integers and // is the integer division operator – the result of this operation is the integer 2 . Now we are left with 8.5 + 2 - 2.5 . The addition and subtraction are at the same level of precedence, so they are evaluated left-to-right, starting with addition. First 2 is converted to the floating-point number 2.0 , and the two floating-point numbers are added, which leaves us with 10.5 - 2.5 . The result of this floating-point subtraction is 8.0 , which is assigned to result .

Explicit conversion

Converting numbers from float to int will result in a loss of precision. For example, try to convert 5.834 to an int – it is not possible to do this without losing precision. In order for this to happen, we must explicitly tell Python that we are aware that precision will be lost. For example, we tell the interpreter to convert a float to an int like this:

i = int(5.834)

The int function converts a float to an int by discarding the fractional part – it truncates toward zero, so positive numbers are rounded down and negative numbers are rounded up towards zero. If we want more control over the way in which the number is rounded, we will need to use a different function:

# the floor and ceil functions are in the math module
import math

# ceil returns the closest integer greater than or equal to the number
# (so it always rounds up)
i = math.ceil(5.834)

# floor returns the closest integer less than or equal to the number
# (so it always rounds down)
i = math.floor(5.834)

# round returns the closest integer to the number
# (so it rounds up or down)
# Note that this is a built-in function -- we don't need to import math to use it.
i = round(5.834)
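The distinction between these functions matters most for negative numbers: int() truncates toward zero, while floor and ceil always round in one direction (a quick check):

```python
import math

print(int(-5.834))         # -5 -- truncates toward zero
print(math.floor(-5.834))  # -6 -- always rounds down
print(math.ceil(-5.834))   # -5 -- always rounds up
print(round(-5.834))       # -6 -- closest integer
```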

Explicit conversion is sometimes also called casting – we may read about a float being cast to int or vice-versa.

Converting to and from strings

As we saw in the earlier sections, Python seldom performs implicit conversions to and from str – we usually have to convert values explicitly. If we pass a single number (or any other value) to the print function, it will be converted to a string automatically – but if we try to add a number and a string, we will get an error:

# This is OK
print(5)
print(6.7)

# This is not OK
print("3" + 4)

# Do you mean this...
print("3%d" % 4) # concatenate "3" and "4" to get "34"

# Or this?
print(int("3") + 4) # add 3 and 4 to get 7

To convert numbers to strings, we can use string formatting – this is usually the cleanest and most readable way to insert multiple values into a message. If we want to convert a single number to a string, we can also use the str function explicitly:

# These lines will do the same thing
print("3%d" % 4)
print("3" + str(4))
More about conversions

In Python, functions like str , int and float will try to convert anything to their respective types – for example, we can use the int function to convert strings to integers or to convert floating-point numbers to integers. Note that although int can convert a float to an integer it can't convert a string containing a float to an integer directly!

# This is OK
int("3")

# This is OK
int(3.7)

# This is not OK
int("3.7") # This is a string representation of a float, not an integer!

# We have to convert the string to a float first
int(float("3.7"))

Values of type bool can contain the value True or False . These values are used extensively in conditional statements, which execute or do not execute parts of our program depending on some binary condition:

my_flag = True

if my_flag:
    print("Hello!")

The condition is often an expression which evaluates to a boolean value:

if 3 > 4:
    print("This will not be printed.")

However, almost any value can implicitly be converted to a boolean if it is used in a statement like this:

my_number = 3

if my_number:
    print("My number is non-zero!")

This usually behaves in the way that you would expect: non-zero numbers are True values and zero is False . However, we need to be careful when using strings – the empty string is treated as False , but any other string is True – even "0" and "False" !

# bool is a function which converts values to booleans
bool(34) # True
bool(0) # False
bool(1) # True

bool("") # False
bool("Jane") # True
bool("0") # True!
bool("False") # Also True!
Exercise 3
  1. Convert "8.8" to a float.
  2. Convert 8.8 to an integer (with rounding).
  3. Convert "8.8" to an integer (with rounding).
  4. Convert 8.8 to a string.
  5. Convert 8 to a string.
  6. Convert 8 to a float.
  7. Convert 8 to a boolean.
Answers to exercises

Answer to exercise 1
  1. a is a local variable in the scope of my_function because it is an argument name. b is also a local variable inside my_function , because it is assigned a value inside my_function . c and d are both global variables. It doesn't matter that d is created inside an if block, because the inside of an if block is not a new scope – everything inside the block is part of the same scope as the outside (in this case the global scope). Only function definitions (which start with def ) and class definitions (which start with class ) indicate the start of a new level of scope.
  2. Both a and b will be created every time my_function is called and destroyed when my_function has finished executing. c is created when it is assigned the value 3 , and exists for the remainder of the program's execution. d is created inside the if block (when it is assigned the value which is returned from the function), and also exists for the remainder of the program's execution.
  3. As we will learn in the next chapter, if blocks are executed conditionally . If c were not greater than 2 in this program, the if block would not be executed, and if that were to happen the variable d would never be created.
  4. We may use the variable d later in the code, assuming that it always exists, and have our program crash unexpectedly if it doesn't. It is considered poor coding practice to allow a variable to be defined or undefined depending on the outcome of a conditional statement. It is better to ensure that d is always defined, no matter what – for example, by assigning it some default value at the start. It is much easier and cleaner to check if a variable has the default value than to check whether it exists at all.
Answer to exercise 2
  1. Here is an example program:

    T_f = float(input("Please enter a temperature in °F: "))
    T_c = (5/9) * (T_f - 32)
    print("%g°F = %g°C" % (T_f, T_c))
    

    Note

    The formatting symbol %g is used with floats, and instructs Python to pick a sensible human-readable way to display the float.

  2. The program could crash if the user enters a value which cannot be converted to a floating-point number. We would need to add some kind of error checking to make sure that this doesn't happen – for example, by storing the string value and checking its contents. If we find that the entered value is invalid, we can either print an error message and exit or keep prompting the user for input until valid input is entered.

Answer to exercise 3

Here are example answers:

# round is a built-in function -- no import is needed
a_1 = float("8.8")
a_2 = round(8.8)
a_3 = round(float("8.8"))
a_4 = "%g" % 8.8
a_5 = "%d" % 8
a_6 = float(8)
a_7 = bool(8)
© Copyright 2013, 2014, University of Cape Town and individual contributors. This work is released under the CC BY-SA 4.0 licence.

[Dec 07, 2017] BitManipulation - Python Wiki

Dec 07, 2017 | wiki.python.org

Here is some information and goals related to Python bit manipulation, binary manipulation.

Some tasks include:

  • Turn "11011000111101..." into bytes, (padded left or right, 0 or 1,) and vice versa.
  • Slice ranges of bits
  • Rotate bits, addressed by the bit. That is, say: "rotate bits 13-17, wrapping around the edges," or, "rotate bits 13-17, lose bits on the one side, set all new bits to 0."
  • Similarly, reverse regions of bits, apply logic to regions of bits, etc.
  • Switch Endianness, with different block sizes.
  • Apply operations in block groupings: ex: apply XOR 10101 (5 bits) repeatedly across a field.
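For the first task above, a minimal sketch in Python 3 (bits_to_bytes is an illustrative name, not a library function; this version pads on the right with zeros):

```python
def bits_to_bytes(bits):
    """Pack a string of '0'/'1' characters into bytes, padding on the right."""
    padded = bits.ljust(-(-len(bits) // 8) * 8, '0')  # round length up to a multiple of 8
    return bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))

print(bits_to_bytes('11011000'))  # b'\xd8'
print(bits_to_bytes('1101'))      # b'\xd0' -- padded to '11010000'
```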

Relevant libraries include:

Some simple code is at ASPN: bit-field manipulation.

Here are some other examples.

Manipulations

To integer.

>>> print int('00100001', 2)
33

To hex string. Note that you don't need to use a multiple of 8 bits.

>>> print "0x%x" % int('11111111', 2)
0xff
>>> print "0x%x" % int('0110110110', 2)
0x1b6
>>> print "0x%x" % int('0010101110101100111010101101010111110101010101', 2)
0xaeb3ab57d55

To character. 8 bits max.

>>> chr(int('111011', 2))
';'
>>> chr(int('1110110', 2))
'v'
>>> chr(int('11101101', 2))
'\xed'

Characters to integers, but not to strings of 1's and 0's.

>>> int('01110101', 2)
117
>>> chr(int('01110101', 2))
'u'
>>> ord('u')
117

Individual bits.

>>> 1 << 0
1
>>> 1 << 1
2
>>> 1 << 2
4
>>> 1 << 3
8
>>> 1 << 4
16
>>> 1 << 5
32
>>> 1 << 6
64
>>> 1 << 7
128
Transformations Summary

Strings to Integers:

  • "1011101101" : int(str, 2)

  • "m" : ord(str)

  • "0xdecafbad" : int(str, 16) (known to work in Python 2.4)

  • "decafbad" : int(str, 16) (known to work in Python 2.4)

Integers to Strings:

  • "1011101101" : built-in to Python 3 (see below)

  • "m" : chr(val)

  • "0xdecafbad" : hex(val)

  • "decafbad" : "%x" % val

We are still left without a technique for producing binary strings, and deciphering hex strings.

Hex String to Integer

Use the int type with the base argument:

>>> int('0xff',16)
255
>>> int('d484fa894e',16)
912764078414

Do not use alternatives that utilize eval. eval will execute code passed to it and can thus compromise the security of your program.

Integer to Bin String

Python 2.6+ and Python 3 support binary literals (e.g. 0b10011000 ) and have a built-in bin() function. For older versions:

>>> def bin(a):
        s=''
        t={'0':'000','1':'001','2':'010','3':'011',
           '4':'100','5':'101','6':'110','7':'111'}
        for c in oct(a)[1:]:
                s+=t[c]
        return s

or better:

def bin(s):
    return str(s) if s<=1 else bin(s>>1) + str(s&1)
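In Python 2.6+ and 3, the built-in bin() and the format() function provide the same information directly, without a custom helper:

```python
print(bin(25))             # '0b11001'
print(format(25, 'b'))     # '11001' -- no 0b prefix
print(format(25, '010b'))  # '0000011001' -- zero-padded to 10 digits
```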
Python Integers

From "The Python Language Reference" page on the Data Model:

"Integers (int) These represent numbers in an unlimited range, subject to available (virtual) memory only. For the purpose of shift and mask operations, a binary representation is assumed, and negative numbers are represented in a variant of 2's complement which gives the illusion of an infinite string of sign bits extending to the left."
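The "infinite string of sign bits" is observable directly: shifting a negative number right never runs out of sign bits. A quick check:

```python
# Negative Python ints behave as if the sign bit extends forever to the left.
print(-1 >> 100)  # -1: right-shifting never exhausts the sign bits
print(-8 >> 1)    # -4: arithmetic shift, sign preserved
print(~0)         # -1: bitwise NOT of 0 is "all ones" in two's complement
```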

Prior to Python 3.1, there was no easy way to determine how Python represented a specific integer internally, i.e. how many bits were used. Python 3.1 adds a bit_length() method to the int type that does exactly that.

Unless you know your numbers are bounded by a certain bit length (for instance, values from arrays of fixed-width integers), shifts, rotations, etc. may give unexpected results.

The number of the highest bit set is the highest power of 2 less than or equal to the input integer. This is the same as the exponent of the floating point representation of the integer, and is also called its "integer log base 2".(ref.1)

In versions before 3.1, the easiest way to determine the highest bit set is*:

* There is a long discussion on this topic, and why this method is not good, in "Issue 3439" at Python.org: http://bugs.python.org/issue3439 This discussion led up to the addition of bit_length() in Python 3.1.

import math

hiBit = math.floor(math.log(int_type, 2))

An input less than or equal to 0 results in a "ValueError: math domain error".

The section "Finding integer log base 2 of an integer" on the "Bit Twiddling Hacks"(ref.1) web page includes a number of methods for determining this value for integers of known magnitude, presumably when no math coprocessor is available. The only method generally applicable to Python integers of unknown magnitude is the "obvious way" of counting the number of bitwise shift operations needed to reduce the input to 0.
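In Python 3.1+, the bit_length() method mentioned above sidesteps the floating-point route entirely; the position of the highest set bit is simply bit_length() - 1. A quick check:

```python
# bit_length() is exact even for integers far beyond float precision.
n = 2 ** 100
print(n.bit_length())      # 101: total bits needed to represent n
print(n.bit_length() - 1)  # 100: position of the highest set bit
```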

Bit Length Of a Python Integer

bitLen() counts the actual bit length of a Python integer, that is, the position of the highest non-zero bit plus 1. Zero, with no non-zero bit, returns 0. As should be expected from the quote above about "the illusion of an infinite string of sign bits extending to the left," a negative number sends the function into an infinite loop.

The function can return any result up to the length of the largest integer your computer's memory can hold.

def bitLen(int_type):
    length = 0
    while (int_type):
        int_type >>= 1
        length += 1
    return(length)

for i in range(17):
    print(bitLen(i))

# results: 0, 1, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 5

The method using the math module is much faster, especially on huge numbers with hundreds of decimal digits.

bitLenCount()

In common usage, the "bit count" of an integer is the number of set (1) bits, not the bit length described above. bitLen() can be modified to also return the count of set bits in the integer. Faster methods for obtaining the count alone are given below.

def bitLenCount(int_type):
    length = 0
    count = 0
    while (int_type):
        count += (int_type & 1)
        length += 1
        int_type >>= 1
    return(length, count)
Operations on Integers of Unknown Magnitude

Some procedures don't need to know the magnitude of an integer to give meaningful results.

bitCount()

The procedure and the information below were found in "Bit Twiddling Hacks"(ref.1)

  • - - - - - - - - - - - - - - - - - - - - - - - -

Counting bits set, Brian Kernighan's way*

unsigned int v;          // count the number of bits set in v
unsigned int c;          // c accumulates the total bits set in v
for (c = 0; v; c++)
{   v &= v - 1;  }       // clear the least significant bit set

This method goes through as many iterations as there are set bits. So if we have a 32-bit word with only the high bit set, then it will only go once through the loop.

* The C Programming Language 2nd Ed., Kernighan & Ritchie, 1988.

Don Knuth pointed out that this method was published by Peter Wegner in CACM 3 (1960), 322. Also discovered independently by Derrick Lehmer and published in 1964 in a book edited by Beckenbach.

  • - - - - - - - - - - - - - - - - - - - - - - - -

Kernighan and Knuth, potent endorsements!

This works because each subtraction "borrows" from the lowest 1-bit. For example:

#       loop pass 1                 loop pass 2
#      101000     101000           100000     100000
#    -      1   & 100111         -      1   & 011111
#    = 100111   = 100000         = 011111   =      0

It is an excellent technique for Python, since the size of the integer need not be determined beforehand.

def bitCount(int_type):
    count = 0
    while(int_type):
        int_type &= int_type - 1
        count += 1
    return(count)
parityOf()

From "Bit Twiddling Hacks"

Code almost identical to bitCount(), above, calculates the parity of an integer, returning 0 if there are an even number of set bits and -1 if there are an odd number. In fact, counting the bits and checking whether the result is odd with bitCount(x) & 1 is about the same speed as the parity function.

def parityOf(int_type):
    parity = 0
    while (int_type):
        parity = ~parity
        int_type = int_type & (int_type - 1)
    return(parity)
lowestSet()

To determine the bit number of the lowest bit set in an integer, in twos-complement notation i & -i zeroes all but the lowest set bit. The bitLen() procedure then determines its position. Obviously, negative numbers return the same result as their opposite. In this version, an input of 0 returns -1, in effect an error condition.

# For example:
#    00111000     # 56
#    11001000     # twos complement, -56
# &= 00001000
def lowestSet(int_type):
    low = (int_type & -int_type)
    lowBit = -1
    while (low):
        low >>= 1
        lowBit += 1
    return(lowBit)
Single bits

The usual single-bit operations will work on any Python integer. It is up to the programmer to be sure that the value of 'offset' makes sense in the context of the program.

# testBit() returns a nonzero result, 2**offset, if the bit at 'offset' is one.

def testBit(int_type, offset):
    mask = 1 << offset
    return(int_type & mask)

# setBit() returns an integer with the bit at 'offset' set to 1.

def setBit(int_type, offset):
    mask = 1 << offset
    return(int_type | mask)

# clearBit() returns an integer with the bit at 'offset' cleared.

def clearBit(int_type, offset):
    mask = ~(1 << offset)
    return(int_type & mask)

# toggleBit() returns an integer with the bit at 'offset' inverted, 0 -> 1 and 1 -> 0.

def toggleBit(int_type, offset):
    mask = 1 << offset
    return(int_type ^ mask)
Bit fields, e.g. for communication protocols

If you need to interpret individual bits in some data, e.g. a byte stream in a communications protocol, you can use the ctypes module.

import ctypes
c_uint8 = ctypes.c_uint8

class Flags_bits( ctypes.LittleEndianStructure ):
    _fields_ = [
                ("logout",     c_uint8, 1 ),  # asByte & 1
                ("userswitch", c_uint8, 1 ),  # asByte & 2
                ("suspend",    c_uint8, 1 ),  # asByte & 4
                ("idle",       c_uint8, 1 ),  # asByte & 8
               ]

class Flags( ctypes.Union ):
    _anonymous_ = ("bit",)
    _fields_ = [
                ("bit",    Flags_bits ),
                ("asByte", c_uint8    )
               ]

flags = Flags()
flags.asByte = 0x2  # ->0010

print( "logout: %i"      % flags.bit.logout   )
# `bit` is defined as an anonymous field, so its fields can also be accessed directly:
print( "logout: %i"      % flags.logout     )
print( "userswitch:  %i" % flags.userswitch )
print( "suspend   :  %i" % flags.suspend    )
print( "idle  : %i"      % flags.idle       )
>>>
logout: 0
logout: 0
userswitch:  1
suspend   :  0
idle  : 0
References

ref.1. "Bit Twiddling Hacks" By Sean Eron Anderson

ref.2. "The Art of Assembly Language" by Randall Hyde

ref.3. Hacker's Delight

[Dec 07, 2017] foobarnbaz.com - Understanding Python variables and Memory Management

Dec 07, 2017 | foobarnbaz.com

Understanding Python variables and Memory Management Jul 08, 2012

Have you ever noticed any difference between variables in Python and C? For example, when you do an assignment like the following in C, it actually creates a block of memory space to hold the value of that variable.

int a = 1;

You can think of it as putting the assigned value in a box labeled with the variable name.

A new box is created for every variable you declare, holding that variable's value. If you change the value of the variable, the box is updated with the new value. That means doing

a = 2;

simply replaces the contents of the box.

Assigning one variable to another makes a copy of the value and puts it in the new box:

int b = a;

But in Python, variables work more like tags than like the boxes you have seen before. When you do an assignment in Python, it tags the value with the variable name.

a = 1

If you change the value of the variable, it just moves the tag onto the new value in memory. You don't need to do the housekeeping job of freeing the memory here; Python's automatic garbage collection does it for you. When a value is left without any names/tags, it is automatically removed from memory.

a = 2

Assigning one variable to another makes a new tag bound to the same value:

b = a

Other languages have 'variables'. Python has 'names'.

A bit about Python's memory management

As you have seen before, a value has only one copy in memory, and all the variables having this value refer to that memory location. For example, when variables a, b, c all have the value 10, it doesn't mean there are three copies of 10 in memory; there is only one 10, and a, b, c all point to it. Once a variable is updated, say by a += 1, a new value 11 is allocated in memory and a points to that.

Let's check this behaviour with Python Interpreter. Start the Python Shell and try the following for yourselves.

>>> a = 10
>>> b = 10
>>> c = 10
>>> id(a), id(b), id(c)
(140621897573616, 140621897573616, 140621897573616)
>>> a += 1
>>> id(a)
140621897573592

id() returns an object's identity, which in CPython is its memory address. As you have noticed, when you assign the same integer value to the variables, we see the same ids. But this does not hold true all the time. See the following, for example:

>>> x = 500
>>> y = 500
>>> id(x)
4338740848
>>> id(y)
4338741040

What happened here? Even after assigning the same integer value to different variable names, we get two different ids. This is an effect of a CPython optimization: the CPython implementation keeps an array of integer objects for all integers between -5 and 256, so creating an integer in that range simply references the existing object. You may refer to the following links for more information.

Stack Overflow: "is" operator behaves unexpectedly with integers

Let's take a look at strings now.

>>> s1 = 'hello'
>>> s2 = 'hello'
>>> id(s1), id(s2)
(4454725888, 4454725888)
>>> s1 == s2
True
>>> s1 is s2
True
>>> s3 = 'hello, world!'
>>> s4 = 'hello, world!'
>>> id(s3), id(s4)
(4454721608, 4454721664)
>>> s3 == s4
True
>>> s3 is s4
False

Interesting, isn't it? When the string was a simple and shorter one, the variable names were referring to the same object in memory. But when they became bigger, this was not the case. This is called interning: Python interns (to some extent) shorter string literals (as in s1 and s2) which are created at compile time. But in general, Python string literals create a new string object each time (as in s3 and s4). Interning is runtime dependent and is always a trade-off between memory use and the cost of checking whether you are creating the same string. There's a built-in intern() function (sys.intern() in Python 3) to apply interning forcibly. Read more about interning from the following links.

Stack Overflow: Does Python intern Strings?
Stack Overflow: Python String Interning
Internals of Python String Interning
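Forced interning can be demonstrated with sys.intern(), the Python 3 spelling of the intern() builtin:

```python
import sys

# Interning the same text twice hands back the same object,
# so the identity check succeeds even for longer strings.
s3 = sys.intern('hello, world!')
s4 = sys.intern('hello, world!')
print(s3 is s4)  # True: both names share a single interned object
```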

Now we will try to create custom objects and try to find their identities.

>>> class Foo:
...     pass
...
>>> bar = Foo()
>>> baz = Foo()
>>> id(bar)
140730612513248
>>> id(baz)
140730612513320

As you can see, the two instances have different identities; they are two distinct objects in memory. When you create custom objects, they will have unique identities, unless you use the Singleton pattern, which overrides this behaviour (in __new__()) by handing out the same instance on every instantiation.
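A minimal sketch of that Singleton override (hypothetical class, for illustration only):

```python
class Singleton:
    _instance = None

    def __new__(cls):
        # Hand out the cached instance instead of allocating a new one.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Singleton()
b = Singleton()
print(a is b)          # True
print(id(a) == id(b))  # True: one object, one identity
```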

Thanks to Jay Pablo (see comments) for correcting the mistakes and making this post a better one.

[Dec 07, 2017] In what structure is a Python object stored in memory - Stack Overflow

Dec 07, 2017 | stackoverflow.com

In what structure is a Python object stored in memory? [duplicate]

Nov 1, 2010

Possible Duplicate:
How do I determine the size of an object in Python?

Say I have a class A:

class A(object):
    def __init__(self, x):
        self.x = x

    def __str__(self):
        return self.x

And I use sys.getsizeof to see how many bytes instance of A takes:

>>> sys.getsizeof(A(1))
64
>>> sys.getsizeof(A('a'))
64
>>> sys.getsizeof(A('aaa'))
64

As illustrated in the experiment above, the size of an A object is the same no matter what self.x is.

So I wonder: how does Python store an object internally?

Björn Pollex ,Oct 31, 2010 at 10:38

This is certain to differ over python implementations. Which one are you talking about? – Björn Pollex Oct 31 '10 at 10:38

Thomas Wouters ,Oct 31, 2010 at 11:26

It depends on what kind of object, and also which Python implementation :-)

In CPython, which is what most people use when they use python , all Python objects are represented by a C struct, PyObject . Everything that 'stores an object' really stores a PyObject * . The PyObject struct holds the bare minimum information: the object's type (a pointer to another PyObject ) and its reference count (an ssize_t -sized integer.) Types defined in C extend this struct with extra information they need to store in the object itself, and sometimes allocate extra data separately.

For example, tuples (implemented as a PyTupleObject "extending" a PyObject struct) store their length and the PyObject pointers they contain inside the struct itself (the struct contains a 1-length array in the definition, but the implementation allocates a block of memory of the right size to hold the PyTupleObject struct plus exactly as many items as the tuple should hold.) The same way, strings ( PyStringObject ) store their length, their cached hashvalue, some string-caching ("interning") bookkeeping, and the actual char* of their data. Tuples and strings are thus single blocks of memory.

On the other hand, lists ( PyListObject ) store their length, a PyObject ** for their data and another ssize_t to keep track of how much room they allocated for the data. Because Python stores PyObject pointers everywhere, you can't grow a PyObject struct once it's allocated -- doing so may require the struct to move, which would mean finding all pointers and updating them. Because a list may need to grow, it has to allocate the data separately from the PyObject struct. Tuples and strings cannot grow, and so they don't need this. Dicts ( PyDictObject ) work the same way, although they store the key, the value and the cached hashvalue of the key, instead of just the items. Dict also have some extra overhead to accommodate small dicts and specialized lookup functions.

But these are all types in C, and you can usually see how much memory they would use just by looking at the C source. Instances of classes defined in Python rather than C are not so easy. The simplest case, instances of classic classes, is not so difficult: it's a PyObject that stores a PyObject * to its class (which is not the same thing as the type stored in the PyObject struct already), a PyObject * to its __dict__ attribute (which holds all other instance attributes) and a PyObject * to its weakreflist (which is used by the weakref module, and only initialized if necessary.) The instance's __dict__ is usually unique to the instance, so when calculating the "memory size" of such an instance you usually want to count the size of the attribute dict as well. But it doesn't have to be specific to the instance! __dict__ can be assigned to just fine.

New-style classes complicate matters. Unlike with classic classes, instances of new-style classes are not separate C types, so they do not need to store the object's class separately. They do have room for the __dict__ and weakreflist references, but unlike classic instances they don't require the __dict__ attribute for arbitrary attributes. If the class (and all its baseclasses) uses __slots__ to define a strict set of attributes, and none of those attributes is named __dict__, the instance does not allow arbitrary attributes and no dict is allocated. On the other hand, attributes defined by __slots__ have to be stored somewhere. This is done by storing the PyObject pointers for the values of those attributes directly in the PyObject struct, much as is done with types written in C. Each entry in __slots__ will thus take up a PyObject *, regardless of whether the attribute is set or not.
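The __slots__ behaviour described above is easy to observe; a sketch with a hypothetical class:

```python
class Point:
    __slots__ = ('x', 'y')   # fixed attribute set, stored in the struct itself

p = Point()
p.x = 1
print(hasattr(p, '__dict__'))  # False: no per-instance dict was allocated
try:
    p.z = 3                    # not listed in __slots__
except AttributeError:
    print('arbitrary attributes are rejected')
```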

All that said, the problem remains that since everything in Python is an object and everything that holds an object just holds a reference, it's sometimes very difficult to draw the line between objects. Two objects can refer to the same bit of data. They may hold the only two references to that data. Getting rid of both objects also gets rid of the data. Do they both own the data? Does only one of them, but if so, which one? Or would you say they own half the data, even though getting rid of one object doesn't release half the data? Weakrefs can make this even more complicated: two objects can refer to the same data, but deleting one of the objects may cause the other object to also get rid of its reference to that data, causing the data to be cleaned up after all.

Fortunately the common case is fairly easy to figure out. There are memory debuggers for Python that do a reasonable job at keeping track of these things, like heapy . And as long as your class (and its baseclasses) is reasonably simple, you can make an educated guess at how much memory it would take up -- especially in large numbers. If you really want to know the exact sizes of your datastructures, consult the CPython source; most builtin types are simple structs described in Include/<type>object.h and implemented in Objects/<type>object.c . The PyObject struct itself is described in Include/object.h . Just keep in mind: it's pointers all the way down; those take up room too.

satoru ,Oct 31, 2010 at 12:43

Thanks very much. In fact, I'm asking this question because I want to know what's stored in memcached when I invoke cache.set(key, obj). Is it something like a pickled object? – satoru Oct 31 '10 at 12:43

Thomas Wouters ,Oct 31, 2010 at 16:00

Oh, well! That's a completely different question. As I recall (and a quick glance at the source confirms), the memcache module stores pickled versions of the object, yes. It also creates a new pickler for each store, so storing two objects that refer to the same third object means the third object is pickled twice (unless your objects don't pickle that way, of course; you can define pickling exactly how you want.) In other words, the answer to your question is 'len(pickle.dumps(obj))' . – Thomas Wouters Oct 31 '10 at 16:00
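Thomas's len(pickle.dumps(obj)) estimate can be sketched as follows (hypothetical object, for illustration):

```python
import pickle

# Roughly what a pickling cache client would serialize for cache.set(key, obj).
obj = {'key': [1, 2, 3], 'name': 'example'}
stored = pickle.dumps(obj)
print(len(stored))  # size in bytes of the pickled payload
```

The exact byte count depends on the pickle protocol version, so it is an estimate of the stored size, not a guarantee.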

tmthydvnprt ,Mar 13, 2016 at 13:31

For the graphically curious, I once tested and plotted this for multiple builtin types: stackoverflow.com/a/30008338/2087463tmthydvnprt Mar 13 '16 at 13:31

> ,

In the case of a new class instance, getsizeof() returns the size of a reference to the PyObject returned by the C function PyInstance_New().

If you want a list of the sizes of all the built-in objects, check this.

[Dec 05, 2017] python - Problems installing python3 on RHEL - Stack Overflow

Dec 05, 2017 | stackoverflow.com

gecco ,Nov 13, 2011 at 13:53

It is easy to install it manually:
  1. Download (there may be newer releases on Python.org ):
    $ wget https://www.python.org/ftp/python/3.4.3/Python-3.4.3.tar.xz
  2. Unzip
    $ tar xf Python-3.* 
    $ cd Python-3.*
  3. Prepare compilation
    $ ./configure
  4. Build
    $ make
  5. Install
    $ make install

    OR, if you don't want to overwrite the python executable (safer; on some distros, such as RHEL 6, yum needs python to be 2.x), you can install python3.* as a concurrent instance alongside the system default with an altinstall:

    $ make altinstall

Now if you want an alternative installation directory, you can pass --prefix to the configure command.

Example: for 'installing' Python in /opt/local, just add --prefix=/opt/local .

After the make install step: in order to use your new Python installation, you may still have to add [prefix]/bin to $PATH and [prefix]/lib to $LD_LIBRARY_PATH (depending on the --prefix you passed).

rajadhiraja ,Jul 9, 2012 at 17:58

You used: bzip2 -cd Python-3.2.2.tar.bz2 | tar xvf - This is also a simpler possibility: tar jxvf Python-3.2.2.tar.bz2 – rajadhiraja Jul 9 '12 at 17:58

dannysauer ,Oct 29, 2014 at 21:38

The bzip2 option to tar was -y on some early systems, before bzip2 was "officially" supported, and some systems that don't use GNU tar don't even have bzip2 support built-in (but may have bzip2 binaries). So depending on how portable things need to be, the bunzip2 -c command (or bzip2 -cd ) may be more portable. RHEL6, as in teh question, supports -j , so this is moot for the actual question. But for posterity... – dannysauer Oct 29 '14 at 21:38

Caleb ,Jan 8, 2015 at 20:39

I got a 301 (moved) into a 404 when using the bz2 tar. I changed it to .tgz and it downloaded fine. – Caleb Jan 8 '15 at 20:39

bnu ,Jun 3, 2016 at 13:10

if you get "no acceptable C compiler found in $PATH" when installing python, refer to http://stackoverflow.com/questions/19816275/no-acceptable-c-compiler-found-in-path-when-installing-python – bnu Jun 3 '16 at 13:10

Searene ,Nov 20, 2016 at 3:44

./configure --with-ensurepip=install to enable pip3 , or you won't have pip3 installed after compilation. – Searene Nov 20 '16 at 3:44

Samuel Phan ,Apr 26, 2014 at 23:30

Installing from RPM is generally better, because:
  • you can install and uninstall (properly) python3.
  • the installation time is way faster. If you work in a cloud environment with multiple VMs, compiling python3 on each VM is not acceptable.
Solution 1: Red Hat & EPEL repositories

Red Hat has added Python 3.4 for CentOS 6 and 7 through the EPEL repository.

Unfortunately:

  • pip3 is not bundled in any RPM. You need to install it manually (see below).
  • pyvenv is bugged and doesn't work. You need to use virtualenv .
[EPEL] How to install Python 3.4 on CentOS 6 & 7
sudo yum install -y epel-release
sudo yum install -y python34

# Install pip3
sudo yum install -y python34-setuptools  # install easy_install-3.4
sudo easy_install-3.4 pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3 install virtualenv
sudo pip3 install virtualenvwrapper

If you want to use pyvenv , you can do the following to install pip3 in your virtualenv:

pyvenv --without-pip my_env
curl https://bootstrap.pypa.io/get-pip.py | my_env/bin/python

But if you want to have it out-of-the-box, you can add this bash function (alias) in your .bashrc :

pyvenv() { /usr/bin/pyvenv --without-pip $@; for env in $@; do curl https://bootstrap.pypa.io/get-pip.py | "$env/bin/python"; done; }
Solution 2: IUS Community repositories

The IUS Community provides some up-to-date packages for RHEL & CentOS. The people behind it are from Rackspace, so I think they are quite trustworthy...

https://ius.io/

Check the right repo for you here:

https://ius.io/GettingStarted/

[IUS] How to install Python 3.5 on CentOS 6
sudo yum install -y https://centos6.iuscommunity.org/ius-release.rpm
sudo yum install -y python35u python35u-pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3.5 install virtualenv
sudo pip3.5 install virtualenvwrapper

Note: you have pyvenv-3.5 available out-of-the-box if you don't want to use virtualenv .

[IUS] How to install Python 3.5 on CentOS 7
sudo yum install -y https://centos7.iuscommunity.org/ius-release.rpm
sudo yum install -y python35u python35u-pip

# I guess you would like to install virtualenv or virtualenvwrapper
sudo pip3.5 install virtualenv
sudo pip3.5 install virtualenvwrapper

Note: you have pyvenv-3.5 available out-of-the-box if you don't want to use virtualenv .

Samuel Phan ,Jul 3, 2015 at 14:54

Fixed the IUS release package URL. they have updated the version, that's all. If they update the package again, you can check the link to their RPM from the webpage. – Samuel Phan Jul 3 '15 at 14:54

Samuel Phan ,Sep 7, 2015 at 9:01

As I said, the link in your answer contains non-printable unicode characters. When I copy/paste your link, here is what I see in VIM: https://dl.iuscommunity.org/pub/ius/stable/CentOS/6/x86_64/i‌​u<200c><200b>s-relea‌​se-1.0-14.iu‌​s.cent‌​os6.noarch.rpm Here is the unicode character: fileformat.info/info/unicode/char/200c/index.htm The URL in my original answer works, I've just tested it. – Samuel Phan Sep 7 '15 at 9:01

Loïc ,Sep 30, 2015 at 13:48

Using this solution, how would you then install pip for python34 ? – Loïc Sep 30 '15 at 13:48

Samuel Phan ,Oct 1, 2015 at 21:11

Very good question, I added a comment for that. It's the best I found. If you want to stick to RPM-based installation, you should use IUS repositories for CentOS 7. They provide a python34u-pip . – Samuel Phan Oct 1 '15 at 21:11

ILMostro_7 ,May 5 at 2:27

easy_install pip3 should work--or a variation of it--to get pip3 installed without needing to curl a specific URL that may or may not be there (anymore). – ILMostro_7 May 5 at 2:27

rsc ,Jul 29, 2012 at 11:15

In addition to gecco's answer I would change step 3 from:
./configure

to:

./configure --prefix=/opt/python3

Then after installation you could also:

# ln -s /opt/python3/bin/python3 /usr/bin/python3

It is to ensure that installation will not conflict with python installed with yum.

See explanation I have found on Internet:

http://www.hosting.com/support/linux/installing-python-3-on-centosredhat-5x-from-source

cababunga ,Feb 12, 2013 at 19:45

Why /opt ? /usr/local specifically exists for this purpose and that's where ./configure with no explicit --prefix will place it. – cababunga Feb 12 '13 at 19:45

rsc ,Feb 13, 2013 at 11:27

@cababunga As I wrote I have been influenced by reading tutorial from specified site. Nevertheless installing python in above way may be usable - it would be a lot easier to uninstall it (it looks like uninstall target for make is not provided). Also you could easily install various versions of python3 in specified separate directories under /opt and manually set which one to use or test. – rsc Feb 13 '13 at 11:27

Caleb ,Jan 8, 2015 at 21:24

You may also want to set up your PATH to contain the binaries folder. For me it was export PATH=$PATH:/opt/python3/binCaleb Jan 8 '15 at 21:24

Paul Draper ,Jan 30, 2014 at 7:52

Use the SCL repos.
sudo sh -c 'wget -qO- http://people.redhat.com/bkabrda/scl_python33.repo >> /etc/yum.repos.d/scl.repo'
sudo yum install python33
scl enable python27

(This last command will have to be run each time you want to use python27 rather than the system default.)

snacks ,Sep 24, 2014 at 13:23

After reading the Red Hat docs, what I needed to do was either scl enable python33 bash to launch a new shell which will be enabled for python 3, or scl enable python33 'python hello.py' which will run your python file using python 3 in the current shell – snacks Sep 24 '14 at 13:23

Nathan Basanese ,Aug 24, 2015 at 21:46

// , What more generic instructions would also allow the installation of Python 3.4? – Nathan Basanese Aug 24 '15 at 21:46

Florian La Roche ,Feb 3, 2013 at 8:53

You can download a source RPMs and binary RPMs for RHEL6 / CentOS6 from here

This is a backport from the newest Fedora development source rpm to RHEL6 / CentOS6

cababunga ,Feb 12, 2013 at 19:40

That's great. Thanks for your effort, Florian. Maybe running createrepo on those directories would make them even more useful for some people. – cababunga Feb 12 '13 at 19:40

lyomi ,Mar 21, 2014 at 15:18

What a relief. the rpm installed perfectly. – lyomi Mar 21 '14 at 15:18

Nathan Basanese ,Sep 3, 2015 at 20:45

// , How do we make a repository from that link? – Nathan Basanese Sep 3 '15 at 20:45

Nathan Basanese ,Sep 3, 2015 at 21:07

// , I can confirm that this works. Hold on, I just whipped up something quick that used that URL as the baseurl : 0bin.net/paste/Nathan Basanese Sep 3 '15 at 21:07

rkuska ,Jul 16, 2015 at 7:58

Python3 was recently added to EPEL7 as Python34.

There is ongoing (currently) effort to make packaging guidelines about how to package things for Python3 in EPEL7.

See https://bugzilla.redhat.com/show_bug.cgi?id=1219411
and https://lists.fedoraproject.org/pipermail/python-devel/2015-July/000721.html

Nathan Basanese ,Aug 24, 2015 at 21:57

// , What's the hold-up? Pip seems like the simple way to go. – Nathan Basanese Aug 24 '15 at 21:57

Mike Guerette ,Aug 27, 2015 at 13:33

Along with Python 2.7 and 3.3, Red Hat Software Collections now includes Python 3.4 - all work on both RHEL 6 and 7.

RHSCL 2.0 docs are at https://access.redhat.com/documentation/en-US/Red_Hat_Software_Collections/

Plus lot of articles at developerblog.redhat.com.

edit

Follow these instructions to install Python 3.4 on RHEL 6/7 or CentOS 6/7:
# 1. Install the Software Collections tools:
yum install scl-utils

# 2. Download a package with repository for your system.
#  (See the Yum Repositories on external link. For RHEL/CentOS 6:)
wget https://www.softwarecollections.org/en/scls/rhscl/rh-python34/epel-6-x86_64/download/rhscl-rh-python34-epel-6-x86_64.noarch.rpm
#  or for RHEL/CentOS 7
wget https://www.softwarecollections.org/en/scls/rhscl/rh-python34/epel-7-x86_64/download/rhscl-rh-python34-epel-7-x86_64.noarch.rpm

# 3. Install the repo package (on RHEL you will need to enable optional channel first):
yum install rhscl-rh-python34-*.noarch.rpm

# 4. Install the collection:
yum install rh-python34

# 5. Start using software collections:
scl enable rh-python34 bash

Nathan Basanese ,Dec 10, 2015 at 23:53

// , Doesn't this require us to enable a special shell? Combined with virtualenvs, I can see that becoming a pain in the ass. – Nathan Basanese Dec 10 '15 at 23:53

Nathan Basanese ,Dec 10, 2015 at 23:55

// , Why does this require scl enable rh-python34 bash ? What are the implications for using this later on? – Nathan Basanese Dec 10 '15 at 23:55

Searene ,Nov 20, 2016 at 2:53

Is there a way to install python3.5 on RedHat 6? I tried wget https://www.softwarecollections.org/en/scls/rhscl/rh-python35/epel-6-x86_64/download/rhscl-rh-python35-epel-6-x86_64.noarch.rpm , but it was not found. – Searene Nov 20 '16 at 2:53

daneel ,Apr 2, 2015 at 14:12

If you want official RHEL packages you can use RHSCL (Red Hat Software Collections)

More details:

You have to have access to Red Hat Customer Portal to read full articles.

Nathan Basanese ,Aug 24, 2015 at 21:55

// , Just upvoted. Would you be willing to make a summary of what one does to use the RHSCL for this? This is a question and answer site, after all. – Nathan Basanese Aug 24 '15 at 21:55

amphibient ,Feb 8 at 17:12

yum install python34.x86_64 works if you have epel-release installed, which this answer explains how to do, and I confirmed it worked on RHEL 7.3
$ cat /etc/*-release
NAME="Red Hat Enterprise Linux Server"
VERSION="7.3 (Maipo)"

$ type python3
python3 is hashed (/usr/bin/python3)

Aty ,Feb 11 at 20:47

Here are the steps i followed to install Python3:

yum install wget

wget https://www.python.org/ftp/python/3.6.0/Python-3.6.0.tar.xz

sudo tar xvf Python-3.*

cd Python-3.*

sudo ./configure --prefix=/opt/python3

sudo make

sudo make install

sudo ln -s /opt/python3/bin/python3 /usr/bin/python3

$ /usr/bin/python3

Python 3.6.0

Nagev ,Mar 6 at 18:21

Three steps using Python 3.5 by Software Collecti