Day Three of the 18th annual QCon San Francisco conference was held on November 20th, 2024, at the Hyatt Regency in San Francisco, California. Key takeaways included: a debate on whether prompt engineering is a programming language or a utility; how Google Lens helps the visually impaired navigate the streets in Google Maps; challenges in migrating to a cellular architecture; and how to implement high-resolution platform observability while avoiding unintended consequences.
What follows is a summary of the keynote address and highlighted presentations.
Keynote Address: Prompt Engineering: Is it a New Programming Language?
Hien Luu, Sr. Engineering Manager at Zoox and author of “MLOps with Ray,” presented his keynote address entitled Prompt Engineering: Is it a New Programming Language? Luu kicked off his presentation by stating:
The most powerful programming language isn’t a programming language at all.
He then posed the question: “Is prompt engineering a new programming language or just word-smithing for those that write JavaScript?”
This keynote was designed to be a debate on this question, with Luu serving as debate moderator and providing arguments for and against each side. After defining the attributes of a programming language and of prompt engineering, Luu conducted an initial vote on the motion from the audience that seemed to favor prompt engineering as a programming language.
The debate was focused on three topics:
- Syntax & Structure
- Skills & Expertise
- Impact & Longevity
Luu provided attributes, examples and ChatGPT demos both for and against each of these topics. For example, the statement, “I saw a man with the telescope,” could be interpreted in two ways: the observer saw a man holding a telescope, or the observer saw the man through the telescope. Luu also maintained that we rely on natural language rather than on specialized skills and expertise.
After providing closing arguments, Luu once again polled the audience. This time, the audience seemed to favor the position that prompt engineering is not a programming language.
Highlighted Presentations: Accessibility with Augmented Reality | Slack Migration to a Cellular Architecture
Making Augmented Reality Accessible: A Case Study of Lens in Maps was presented by Ohan Oda, Senior Software Engineer at Google. Oda kicked off his presentation with two statistics: one out of four 20-year-olds will become disabled before they retire, according to the Council for Disability Income Awareness; and an estimated 1.3 billion people worldwide live with a significant disability, according to the World Health Organization. His presentation focused on individuals who are blind or have low vision.
Oda introduced Google Lens, a “camera based experience in Google Maps that helps on-the-go users understand their surroundings and make decisions confidently by showing information in first-person perspective.” He demonstrated how to use Lens with this short video. While useful when traveling, Oda stated that Lens is not used very much in everyday situations. Using Lens requires the user to hold up the phone while walking, which may cause what Oda defined as “friction”: giving surrounding pedestrians the perception of being recorded.
In his quest to improve usage and user retention, Oda attended several internal accessibility/disability inclusion (ADI) sessions at Google and solicited feedback from visually impaired employees. He also attended external conferences such as XR Access, a research consortium at Cornell Tech dedicated to making virtual, augmented and mixed reality accessible to people with disabilities.
Oda discussed the challenges in improving Google Lens, including reversing the old adage that “a picture may be worth 1000 words.” As he maintained, “the user doesn’t have time to listen to 1000 words.”
Oda concluded with a video from Ross Minor, a gaming, media, and technology accessibility consultant dedicated to improving accessibility for people with disabilities. Minor is blind due to a traumatic event at the age of eight.
Slack’s Migration to a Cellular Architecture was presented by Cooper Bethea, Former Senior Staff Engineer and Technical Lead at Slack. Bethea kicked off his presentation with a peek behind Slack’s architecture, including its web servers and the corresponding data stores.
He discussed the goals and challenges of building a cellular design in which traffic can be “drained” away from an availability zone, and covered the two options of siloing and internally-managed draining.
Bethea introduced Coordination Headwind, a concept in which accomplishing simple things within an organization seems to become much slower over time. He referred to this as organizations becoming slime molds and compared bottom-up and top-down hierarchical designs.
Bethea then introduced Project Cadence, which features: writing and circulating proposals; engaging deeply with high-value services; and expanding to all critical services.
The current state of the cellular design includes:

- Siloed services are drainable in approximately 60 seconds
- Vitess automation can reparent at the speed of replication
- Remaining critical services have roadmaps
- There is a “happy path” to silo for new services
- Drains can happen for incident response, rollouts and even drills
Conclusion
QCon San Francisco, a five-day event consisting of three days of presentations and two days of workshops, is organized by C4Media, a software media company focused on unbiased content and information in the enterprise development community and the creator of InfoQ and QCon. For details on some of the conference tracks, check out these Software Architecture and Artificial Intelligence and Machine Learning news items.