Click here for the first post, which contains the context of this series.
Item #31: Be defensive when iterating over arguments.
Consider
def normalize(X):
    s = sum(X)
    return [x / s for x in X]
normalize works as expected if X is a container, but not if X is an iterator such as a generator: sum(X) exhausts the iterator, so the list comprehension that follows sees no items. Address this by detecting iterators up front, either with iter(X) is X (an iterator returns itself from iter) or with isinstance(X, Iterator), where Iterator is imported from collections.abc.
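A minimal defensive version along these lines (the name normalize_defensive is illustrative, not from the original):
from collections.abc import Iterator

def normalize_defensive(X):
    # Reject iterators (including generators): sum() would exhaust them
    # before the comprehension could iterate over X a second time.
    if isinstance(X, Iterator):
        raise TypeError('Must supply a container, not an iterator')
    s = sum(X)
    return [x / s for x in X]

print(normalize_defensive([3, 1, 1]))
>>>
[0.6, 0.2, 0.2]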
Item #32: Consider generator expressions for large list comprehensions.
Let X be an extraordinarily large iterable. Then
for y in [f(x) for x in X]: pass
will load an extraordinarily large object into memory. On the other hand,
for y in (f(x) for x in X): pass
does not have this problem.
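Generator expressions also compose without materializing intermediate results; a small sketch (the halves/squares names are illustrative):
X = range(10_000_000)                # stand-in for a very large iterable
halves = (x / 2 for x in X)          # nothing is evaluated yet
squares = (h ** 2 for h in halves)   # chains lazily on top of halves
print(next(squares))                 # only the first element is ever computed
>>>
0.0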
Item #33: Compose multiple generators with yield from.
def my_gen():
    yield from gen_1()
    yield from gen_2()
    yield from gen_3()
is shorthand for, and performs better than,
def my_gen():
    for i in gen_1():
        yield i
    for i in gen_2():
        yield i
    for i in gen_3():
        yield i
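A runnable sketch with hypothetical child generators gen_1, gen_2, and gen_3:
def gen_1():
    yield from range(3)

def gen_2():
    yield from 'ab'

def gen_3():
    yield 99

def my_gen():
    yield from gen_1()
    yield from gen_2()
    yield from gen_3()

print(list(my_gen()))
>>>
[0, 1, 2, 'a', 'b', 99]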
Item #34: Avoid injecting data into generators with send.
Consider
def double_inputs():
    while True:
        x = yield
        yield x * 2
gen = double_inputs()
next(gen)
print(gen.send(10))
next(gen)
print(gen.send(6))
next(gen)
print(gen.send(94.3))
>>>
20
12
188.6
Avoid doing this.
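One alternative, sketched here rather than taken from the example above, is to pass the inputs in as an ordinary iterable argument instead of pushing them in with send:
def double_inputs(inputs):
    # Pull values from a supplied iterable instead of receiving them via send().
    for x in inputs:
        yield x * 2

for doubled in double_inputs([10, 6, 94.3]):
    print(doubled)
>>>
20
12
188.6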
Item #35: Avoid causing state transitions in generators with throw.
Consider
def my_gen():
    i = 0
    while i < 10:
        try:
            i += 1
            yield i
        except GeneratorExit:
            return
        except BaseException:
            i = -1
it = my_gen()
print(next(it))
print(next(it))
print(next(it))
it.throw(BaseException())
print(next(it))
>>>
1
2
3
1
Avoid doing this.
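A sketch of one alternative (the MyGen class is illustrative): keep the resettable state in a class whose __iter__ method is a generator, and reset it from the outside with an ordinary method call instead of throwing an exception into the generator.
class MyGen:
    def __init__(self):
        self.i = 0

    def reset(self):
        # Ordinary method call replaces the throw-driven state transition.
        self.i = -1

    def __iter__(self):
        while self.i < 10:
            self.i += 1
            yield self.i

gen_obj = MyGen()
it = iter(gen_obj)
print(next(it))
print(next(it))
print(next(it))
gen_obj.reset()
print(next(it))
>>>
1
2
3
0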