After Westworld's enthralling and multifaceted pilot, expectations were high.  So confident were HBO in their new sci-fi western, based on Michael Crichton's 1973 movie of the same name, that they released episode 2 two days early on HBO Go.  While audiences without the app waited with bated breath, HBO rested its spurs and sipped some whisky.  They knew what they had.

Episode two opens not unlike its progenitor.  We get more of a peek behind Westworld's curtain as we arrive with two new guests, Logan and William (Ben Barnes and Jimmi Simpson, respectively).  While Logan very much reflects the debauched nature a world without consequence can expose, William is unable to hide his cautiously reserved constitution.  Of course there are no prizes for putting together the exploration of the human condition, particularly what a world with no repercussions could stir in our primitive souls, but our time with these guests is surprisingly fleeting for the impact it leaves.  Much like the first episode, the victorious double-barrel delivery of Westworld is its deft ability to entertain while drip-feeding the lofty notions the world presents.

Better yet is the continued existential thread spun by Dolores' father, Abernathy.  Dolores' new sporadic behavior is both puzzling and disquieting.  If the disembodied voice asking "Do you remember?" isn't enough, she has started to repeat her father's whispered proclamation of "These violent delights have violent ends."  When whispered to Thandie Newton's Maeve, the effects are sudden.  Whether it's Dolores or the phrase itself, the question is raised: could a glitch, or indeed an awakening, be contagious?  If an artificial intelligence can question its existence, is it indeed artificial?  Is there a difference?  Who's to say?  It's a subversive rabbit-hole that you may not willingly step into, but you'll find yourself tumbling down nonetheless.

If you manage to avoid your very own ponderous malfunction during the running time, even more seeds are sown regarding the treatment of the hosts.  Radiohead's "No Surprises" chimes away in the saloon just before Maeve begins to experience her own breakdown, catalysed by emerging, traumatic memories from a previous role.  If this is a subtextual portent, more moral questions raise their android heads.  One instrument of control is the concept of nightmares.  It's explained away as a safeguard against an employee forgetting to reset one of the park's robotic hosts for the next day, but more Asimov-ian dilemmas rush forward.  Is a machine mistreated if it's built for that purpose, no matter how perverse?  Are creators responsible for their machines' well-being?  "Can you imagine how fucked we'd be if these poor assholes ever remembered what the guests do to them?" one scientist nonchalantly asks the other.  We may not want to find out...

5* - Do androids dream of post-modern suites?

Further Reading: Westworld Ep1: The Original
