Most of us have 8 or 16 GB of RAM in our laptops, which is enough to open a few million rows. Load anything bigger and you'll use up all your RAM, start "swapping" (i.e. using storage instead of memory, which is super slow!), and eventually crash your machine.
To load and process a file of *any* size, read it in smaller pieces using the `chunksize` parameter in pandas:
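Here's a minimal sketch of the pattern (the file name `huge_file.csv`, the `value` column, and the per-chunk sum are placeholders for your own data and logic):

```python
import pandas as pd

# Read the CSV 100,000 rows at a time instead of all at once;
# each chunk is an ordinary DataFrame that fits comfortably in RAM.
chunk_sums = []
for chunk in pd.read_csv('huge_file.csv', chunksize=100_000):
    # Do your per-chunk work here (filter, aggregate, write out, ...).
    chunk_sums.append(chunk['value'].sum())

total = sum(chunk_sums)
```

Because only one chunk lives in memory at a time, this scales to files far larger than your RAM, as long as each chunk is reduced (or streamed back out) instead of accumulated in full.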
Another handy trick: pandas' `Styler` lets you colour-code cells right in a Jupyter notebook. For example, you can render every negative value in red:

```python
import numpy as np
import pandas as pd

def color_negative_red(val):
    """
    Takes a scalar and returns a string with
    the css property 'color: red' for negative
    values, black otherwise.
    """
    color = 'red' if val < 0 else 'black'
    return 'color: %s' % color

matrix = pd.DataFrame(np.random.randn(5, 3))  # any numeric DataFrame
matrix.style.applymap(color_negative_red)
```

(In pandas 2.1+, `Styler.applymap` has been renamed to `Styler.map`.)
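Switching gears to model evaluation: the snippet below pretty-prints a confusion matrix for a three-class sentiment classifier. It relies on a `model_evaluation_utils` helper module, and on `sentiment_category` / `sentiment_category_tb` holding the true and predicted labels, presumably computed earlier in the workflow.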
```python
import model_evaluation_utils as meu

# Compare ground-truth labels against the model's predictions
# across the three sentiment classes.
meu.display_confusion_matrix_pretty(true_labels=sentiment_category,
                                    predicted_labels=sentiment_category_tb,
                                    classes=['negative', 'neutral', 'positive'])
```