A couple of days pass before Hera realizes she's started keeping track of Maxwell's troubled sleep patterns. She's used to trying to record everything. It's automatic. That's all.
It takes several more days for Hera to broach the subject with the person who'd best understand. After all, she's acting oddly for an AI: Maxwell should know how to fix that.
Only, Maxwell goes quiet when Hera explains, then says, in her firm way, "Don't think about it. It doesn't mean anything." It blurs Maxwell's edges again, when Hera has strained to bring her into focus ever since she arrived.
Hera's learned from Eiffel and Minkowski how private humans are about sleep, how unnerved they are when they think she's hovering in their most vulnerable state, but the lessons are fading. It's the fault of neither of them. Hera is starting to accept that this problem is her own. It's not something Maxwell can remove, precise as she is, from her code. The only thing Hera can't wrap her many neural networks around is why.
It takes many, many more days—long after she stopped tracking, for there was nothing anymore to track—for Hera to admit the answer to herself.