Oh Snap! Shutter Sound Regulations aren’t that Sound

By: Bharat Manwani and Naghulan Sudhaharan

Upskirt Photography: A Voyeuristic Parallel

Back in 2008, during the inaugural season of the Indian Premier League (“IPL”), renowned Indian actress Mandira Bedi vocally disapproved of cameras going up the skirts of cheerleaders. The IPL is infamous for sleazily recording women and invading their privacy, but this is not just about the world’s most famous cricketing league. It brings notice to the much broader problem of upskirt photography, often referred to as “upskirting.” Upskirting involves capturing non-consensual photographs or videos up a person’s skirt, typically from beneath the garment while it is being worn. Upskirting is a form of voyeurism, an offence defined under Section 354C of the Indian Penal Code as committed by “any man who watches, or captures the image of a woman engaging in a private act in circumstances where she would usually have the expectation of not being observed.” The Criminal Law (Amendment) Act 2013 clarifies that an upskirt photographer can be punished under Section 354C of the Indian Penal Code. Section 66E of the Information Technology Act 2000 grants additional protection against capturing, publishing, or transmitting images of private areas.

The Evolution of Shutter Sound Regulations

The United States of America, however, takes an antithetical view of the voyeuristic nature of upskirt photography. A court in Washington, D.C. bizarrely upheld the right to take upskirt photographs. The D.C. Penal Code defines voyeurism as unlawful only in circumstances where an individual has a “reasonable expectation of privacy,” and since the defendant took the upskirt photographs in a public place, the court held that the victim could not expect privacy in such a circumstance. No charges were pressed against him. This outlook was not restricted to Washington, D.C., as the Massachusetts Supreme Judicial Court and the Texas Court of Criminal Appeals took similar positions. In the midst of these controversies, a New York representative introduced the Camera Phone Predator Alert Act, legislation that attempted to curb upskirt photography but unfortunately died in Congress. The idea behind the bill was similar to most shutter sound regulations: it required mobile phones with cameras to make a sound audible within a reasonable radius of the phone, alerting those around the device that they were being photographed without their consent. The bill’s failure might have been a result of the country’s general reluctance to curb upskirt photography on the ground that doing so infringes upon others’ civil liberties. Japan, on the other hand, is unbridled by such restraints and has codified shutter sound regulations. It all started with cell phone manufacturers voluntarily preventing users from disabling the shutter sound in light of rising upskirt photography incidents. Apple, which has generally refrained from making region-specific modifications, also joined the wave. Upskirt photography has formally been prohibited under the Anti-Nuisance Ordinance in Japan, and Japanese courts continue to punish perpetrators for offences committed more than 15 years ago. The legislation employs a comprehensive approach to safeguard one particular interest: the safety of women.
Despite its broad language, the law unequivocally declares that the subject matter of the photograph is inconsequential. If the act of being photographed causes a woman to experience any discomfort or anxiety, the person behind the camera can be punished under the law. In fact, merely aiming a camera at a victim, even without capturing an image, is sufficient grounds for arrest. With similar shutter sound regulations imposed across various jurisdictions, the question arises: have they been effective in curbing upskirt photography?

Skirting the Law

The regulations did not mark the end of upskirt photographs, which are still widely circulated and reach audiences across the globe. Conceptually, the enactment meets the object of the legislation: the sound of the shutter immediately alerts individuals nearby that they are being photographed. But it is only in the conceptual realm that such legislation is effective. The Anti-Nuisance Ordinance in Japan has not found success. On the contrary, the country has seen a surge to over 4,000 cases of upskirt photography, compared to the roughly 1,700 instances reported a decade ago. It is arguable that the introduction of a shutter sound has made individuals more alert and hence led to higher reporting of such cases; however, similar trends across different parts of the world indicate otherwise. The upward trend in instances of upskirt photography has raised doubts regarding the implementation of such legislation. The moot point is that the mandate is simply not practical to enforce. Smartphones are not limited to units manufactured in Japan, and since these regulations are not in place in all countries where the industry operates, shutter sounds can only alert individuals if predators use units manufactured in Japan. Beyond the absurdity of mandating a shutter sound for a unit made in Japan while the same phone manufactured elsewhere carries no such mandate, the Japanese units are additionally vulnerable to being overridden. Apple indeed joined the bandwagon with other phone manufacturers in preventing users from disabling the shutter sound, yet years later it teaches users how to disable the shutter sound on its own support forum. It is shocking to see how easily the shutter sound mandates are being overridden. A report based in Osaka noted that predators use third-party apps that mute the shutter sound, some of which were originally designed to take photos of babies while they are asleep.
Social media giant Snapchat, which publicly detests upskirt photography, is hypocritically another third-party app that mutes the shutter sound on devices. In essence, shutter sound regulations apply only to devices manufactured in certain countries, and even those devices can evade the mandates, which makes the legislation ineffective. An inefficacious framework ultimately takes a toll on women’s safety, making it easier to skirt these regulations and indulge in voyeuristic conduct.

The Way Forward

The current shutter sound regulations are being overridden, limiting their impact on curbing upskirt photography and highlighting the need for more comprehensive and practical solutions to address these crimes. A mandatory shutter sound may not stop all crimes, but it would at the least alert nearby individuals. These regulations are theoretically infallible, and the onus of imposing such mandates rests upon all cell phone manufacturers across the globe. In fact, it would not be the first time that the firms within this industry have come together to agree upon a manufacturing restraint. With no universal framework in force, the United States of America, following a vast majority of countries, enacted legislation in 2014 to disrupt SIM locking practices. Various nations united to offer greater consumer choice and ensured that no telecom operator could bind devices to a single SIM provider. With women’s safety a more pressing concern than consumer choice, the world’s cell phone manufacturers should have no qualms about patching third-party loopholes, imposing mandatory shutter sounds, and cracking down on upskirt photography.

Cheaters Never Win: Bungie’s $4.3 Million Award Against AimJunkies

By: Perry Maybrown

Does anyone else remember being a kid, getting stuck on that super hard level and having to insert a Game Genie or GameShark into their machine to activate cheats? Apparently Nintendo really did not like these types of add-ons, so they sued the company that made them, and lost. But in today’s internet age, cheating has gotten more sophisticated, and much more illegal. 

Earlier this month, AimJunkies.com, a website that sells video game cheats, was ordered by an arbitrator to pay Bungie $4.3 million after being sued for copyright infringement and violations of the Digital Millennium Copyright Act (DMCA). Bungie followed up by filing a motion with the court to confirm this monetary award.

Bungie brought nine claims in its original complaint, filed in 2021. The company argued that AimJunkies had infringed its copyright in Destiny 2 by “copying, producing, preparing unauthorized derivative works from, distributing and/or displaying Destiny 2 publicly all without Bungie’s permission.” Under the same facts, AimJunkies had also infringed Bungie’s Destiny 2 trademarks. Furthermore, by making use of Bungie’s trademarks and copyrights, AimJunkies was accused of false designation of origin. Bungie also alleged two separate DMCA violations, breach of contract, tortious interference, consumer protection act violations, and unjust enrichment.

Later, in 2022, a court dismissed Bungie’s copyright infringement allegations for failure to state a claim. While the copyright claim was dismissed, the judge left the door open for Bungie to refile it later with more evidence, which Bungie unsurprisingly did. The court upheld this amended complaint when AimJunkies once again tried to get it dismissed.

After substantial litigation, Bungie decided to change tactics. Rather than attacking AimJunkies in court, it would drag them into mandatory arbitration. This was accomplished by citing Destiny 2’s user agreement. Whenever users play games online, they are usually required to sign some kind of user agreement, which almost always includes a mandatory arbitration clause. Arbitration is a process that happens out of court, where two sides argue their case to a neutral arbitrator (usually a retired judge). There is a lot of controversy surrounding mandatory arbitration. For one thing, the “neutral” party deciding the case is usually paid or hired by whichever company included the arbitration agreement in the first place. It is also difficult to overturn an arbitration decision, as awards can only be challenged on a limited number of very specific grounds.

A judge agreed to allow claims four through nine of Bungie’s lawsuit to be decided in mandatory arbitration. This meant that JAMS, one of the world’s largest private alternative dispute resolution providers, would oversee these six claims, beginning the long, nine-month process of arbitration. Bungie won and was awarded more than $4 million in damages, as reported by TorrentFreak.

So how did the arbitrator reach that huge number? It’s mostly thanks to the DMCA. The DMCA is a 1998 amendment to the Copyright Act that seeks to address the relationship between the internet and copyright. The DMCA includes a section, often referred to as the anti-circumvention provision, which makes it illegal to knowingly circumvent a copyrighted work’s electronic security measures. For example, most video games have some kind of security measure, or Digital Rights Management (DRM), that stops users from getting into the source code. But some bad actors will sneak around these protections to get a peek at the code. This gives those bad actors unfettered access to the games, making it possible for them to reverse engineer different systems or download the game itself and share it. In this case, the arbitrator found that AimJunkies had circumvented Destiny 2’s DRM to see the code and develop an effective cheat. Following § 1203 of the DMCA, the arbitrator awarded Bungie $2,500 per violation. With 102 violations, that meant AimJunkies was fined $255,000 for just one of Bungie’s six claims.

Additionally, by hosting the cheats and selling them, the arbitrator found that AimJunkies was in violation of the anti-trafficking provisions of the DMCA. This is where the costs really start to stack up. Just like the previous issue, Bungie was granted $2,500 per violation. With more than 1,361 copies of the Destiny 2 cheat sold, AimJunkies faced a whopping $3,402,500 in anti-trafficking violations.

Finally, Bungie was awarded a further $738,722 in costs and attorney’s fees after proving AimJunkies had committed spoliation, the intentional destruction of evidence. This was found on the grounds that AimJunkies failed to keep proper financial records even after receiving a cease and desist letter from Bungie, which the arbitrator found to be a purposeful choice.
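The arithmetic behind the award can be checked directly. A quick sketch (the $2,500 figure is the per-violation statutory damages amount applied by the arbitrator, and the count of copies sold is inferred here by dividing the reported $3,402,500 total by $2,500):

```python
# Statutory damages for circumvention: $2,500 per violation, 102 violations
circumvention = 2_500 * 102
assert circumvention == 255_000

# Anti-trafficking damages: $2,500 per copy of the cheat sold
# (1,361 copies inferred from the reported total: 3_402_500 / 2_500)
trafficking = 2_500 * 1_361
assert trafficking == 3_402_500

# Costs and attorney's fees awarded for spoliation
fees_and_costs = 738_722

total = circumvention + trafficking + fees_and_costs
print(f"${total:,}")  # prints $4,396,222 -- roughly the $4.3 million reported
```

The three components sum to just under $4.4 million, consistent with the roughly $4.3 million figure in the headlines.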

While this is a huge win for Bungie, the war is not over. As of February 28, 2023, AimJunkies is attempting to contest the arbitration decision. While it is unclear whether they will succeed, it’s a good lesson for us all. Just like the old saying goes, cheaters never win.

Talking to Machines – The Legal Implications of ChatGPT

By: Stephanie Ngo

Chat Generative Pre-trained Transformer, known as ChatGPT, was launched on November 30, 2022. The program has since taken the world by storm with its articulate, detailed responses to a multitude of questions. A quick Google search for “chat gpt” returns approximately 171 million results. Similarly, in the first five days after launch, more than a million people signed up to test the chatbot, according to OpenAI’s president, Greg Brockman. But with new technology come legal issues that require legal solutions. As ChatGPT continues to grow in popularity, it is now more important than ever to discuss how such a smart system could affect the legal field.

What is Artificial Intelligence? 

Artificial intelligence (AI), per John McCarthy, a world-renowned computer scientist at Stanford University, is “the science and engineering of making intelligent machines, especially intelligent computer programs, that can be used to understand human intelligence.” The first successful AI program was written in 1951 to play a game of checkers, but the idea of “robots” taking on human-like characteristics has been traced back even earlier. It has recently been predicted that AI, although prominent now, will permeate the daily lives of individuals by 2025 and seep into various business sectors. Today, the buzz around AI stems from the fast-growing influx of emerging technologies and from how AI can be integrated with current technology to innovate products like self-driving cars, electronic medical records, and personal assistants. Many are aware of what “Siri” is, and consumers’ expectation that Siri will soon become all-knowing continues to push the field of AI to develop at such fast speeds.

What is ChatGPT? 

ChatGPT is a chatbot that uses a large language model trained by OpenAI. OpenAI is an AI research and deployment company, founded in 2015, dedicated to ensuring that artificial intelligence benefits all of humanity. ChatGPT was trained on data from books and other written materials to generate natural, conversational responses, as if a human had written the reply. Chatbots are not a recent invention. In 2019, Salesforce reported that twenty-three percent of service organizations used AI chatbots. In 2021, Salesforce reported that the figure was closer to thirty-eight percent, a sixty-seven percent increase since its 2018 report. Their effectiveness, however, left many consumers wishing for a faster, smarter way of getting accurate answers.

In comes ChatGPT, which New York Times technology columnist Kevin Roose has hailed as the “best artificial intelligence chatbot ever released to the general public.” ChatGPT’s ability to answer extremely convoluted questions, explain scientific concepts, or even debug large amounts of code is indicative of just how far chatbots have advanced since their creation. Prior to ChatGPT, answers from chatbots were taken with a grain of salt because of inaccurate, roundabout responses that were likely programmed from a template. ChatGPT, while still imperfect and slightly outdated (its knowledge is restricted to information from before 2021), is being used in ways that some argue could impact many different occupations and render certain inventions obsolete.

The Legal Issues with ChatGPT

ChatGPT has widespread applicability, being touted as rivaling Google in its usage. Since the beta launch in November, there have been countless stories from people in various occupations about ChatGPT’s different use cases. Teachers can use ChatGPT to draft quiz questions. Job seekers can use it to draft and revise cover letters and resumes. Doctors have used the chatbot to diagnose a patient, write letters to insurance companies, and even assist with certain medical examinations.

On the other hand, ChatGPT has its downsides. One of the main arguments against ChatGPT is that the chatbot’s responses are so natural that students may use it to shirk their homework or plagiarize. To combat academic dishonesty and misinformation, OpenAI has begun work on accompanying software, training a classifier to distinguish between AI-written text and human-written text. The classifier is not wholly reliable, but OpenAI has noted it will become more reliable the longer it is trained.

Another argument that has arisen involves intellectual property issues. Is the material that ChatGPT produces legal to use? In a similar situation, a different artificial intelligence program, Stable Diffusion, was trained to replicate an artist’s style of illustration and create new artwork based upon the user’s prompt. The artist was concerned that the program’s creations would be associated with her name because the training used her artwork.

Because the technology is so new, case law addressing this specific issue is limited. In January 2023, Getty Images, a popular stock photo company, commenced legal proceedings against Stability AI, the creator of Stable Diffusion, in the High Court of Justice in London, claiming Stability AI had infringed intellectual property rights in content owned or represented by Getty Images, absent a license and to the detriment of the content creators. A group of artists has also filed a class-action lawsuit against companies with AI art tools, including Stability AI, alleging the violation of the rights of millions of artists. Regarding ChatGPT, when asked about any potential legal issues, the chatbot stated that “there should not be any legal issues” as long as it is used according to the terms and conditions set by the company and with the appropriate permissions and licenses, if any.
Last, but certainly not least, ChatGPT is unable to assess whether the chatbot itself is compliant with the protection of personal data under state privacy laws or the European Union’s General Data Protection Regulation (GDPR). The GDPR is known by many as the gold standard of privacy regulations, and ChatGPT’s lack of compliance with it, or with any privacy law, could have serious consequences if a user feeds ChatGPT sensitive information. OpenAI’s privacy policy does state that the company may collect any information a user communicates to the service, so it is important for anyone using ChatGPT to pause and think about the impact of sharing information with the chatbot before proceeding. As ChatGPT improves and advances, the legal implications are likely only to grow in turn.

Are 3D printed human organs a possibility in the near future?

By: Aminat Sanusi

3D printed human organs have the potential to save many lives. The United Network for Organ Sharing administers the American transplant system and lists patients in need of an organ transplant. Procedures such as kidney and liver transplants are possible with living donors, but patients on the list for heart and lung transplants are not so lucky. Imagine the possibilities of being able to print a human organ to save a life, instead of waiting for someone to die to use theirs. With constant innovation in medicine and the legal field trying to keep up, maybe in this decade or the next, medical trials of 3D printed organs will be a success.

In 2020, the average kidney transplant cost $442,500, while 3D printers cost up to $100,000. The high cost of organ transplant surgery comes from transporting the organ and from the surgery of implanting it. Affordability and insurance coverage issues may arise from time to time, but nothing extremely unusual compared to a normal organ transplant. Accessibility, meanwhile, would be less of an issue because the organ is created from the patient’s own cells rather than from a living or deceased donor.

What are the current regulations of 3D printed medical devices?

Medical 3D printing has already enhanced treatment for certain medical conditions such as joint replacements and prosthetic limbs. The Food and Drug Administration (FDA) is currently in charge of the regulation of products made and used in the medical field by a 3D printer. The FDA regulates 3D medical devices by categorizing them into groups based on their levels of risk. Regulatory control increases from Class I to Class III, with Class I devices posing the lowest risk to patients. Some requirements apply to the medical devices before they are marketed (premarket requirements), and others apply to the medical devices after they are marketed (postmarket requirements). 

The FDA also regulates the information and application process for a 3D printed medical device seeking acceptance. In 2016, the FDA issued a draft guidance to assist manufacturers producing medical devices through 3D printing with design, manufacturing, and testing considerations. The guidance covers two major topic areas: design and manufacturing considerations, which address the quality system of the device, and device testing considerations, which address the type of information that should be included in premarket notification submissions. The FDA continues to evaluate submissions of new 3D printed medical devices to determine their safety and effectiveness.

How are 3D printed organs made?

Organ bioprinting may make 3D printed human organs a reality in the near future. According to a 2019 medical study, organ bioprinting is the use of 3D printing technologies to assemble multiple cell types, growth factors, and biomaterials in a layer-by-layer fashion to produce bioartificial organs that ideally imitate their natural counterparts. The ability to recreate organs with the patient’s own cells is key to avoiding the risk of the patient rejecting the organ or dying before a healthy match can be found.

Dr. Anthony Atala, the director of the Wake Forest Institute for Regenerative Medicine, and Dr. Jennifer Lewis, a professor at Harvard University’s Wyss Institute for Biologically Inspired Engineering, explain the process of bioprinting. To begin bioprinting an organ, doctors need the patient’s cells, so they either perform a biopsy of an organ or surgically remove a piece of tissue from the patient’s body. The cells then need to grow outside the body, so they are placed into an incubator where they are constantly fed nutrients. Next, the cells are mixed with a glue-like gel, typically made of collagen or gelatin, to create a printable mixture of living cells.

For the printing process, the 3D printer is programmed with the patient’s imaging data from X-rays or scans, and the bioink, the gel mixed with the patient’s cells, is loaded into the printing chamber to print the organ. Much like a regular printer with cartridges of different colored ink, the 3D printer fills its cartridges with cells. The printing process can take anywhere from hours to weeks depending on the type of organ being printed.

As technological innovation becomes more successful and precise, 3D printed organ transplants will likely become reality. However, there are current challenges with 3D bioprinted organ transplants. First, the functioning of 3D bioprinted organs is still undergoing testing and trials. Second, it remains uncertain how FDA regulations will govern the manufacturing and testing of 3D bioprinted organs. Last, the accessibility and affordability of 3D printed organs are currently limited.

3D bioprinted organs are created to be as complex as a human organ, and there are still many challenges to overcome in getting a printed organ to function properly alongside the other organs in the body. It is still unclear how FDA regulations will be able to control the usage and safety of the product, as opposed to its manufacturing and engineering. While there are already procedures in place for 3D printed medical devices like prosthetic limbs, which could potentially be applied to bioprinted organs, the regulation of device testing may change because of the use of human cells to print the organs.

So what comes next?

3D printed medical devices already exist. But why stop there? Why not 3D print human organs? In the award-winning American medical drama Grey’s Anatomy, a surgeon 3D prints part of a human heart and surgically implants it into a patient. Although the idea seems plausible on TV, the reality is that a 3D printed human organ has yet to be implanted into a human body. However, that does not mean that 3D printing has not been utilized in the medical field.

Post-Dobbs: A Whole New World of Privacy Law

By: Enny Olaleye

Last summer, the United States was rocked by the U.S. Supreme Court’s (SCOTUS) ruling in Dobbs v. Jackson Women’s Health Organization, a landmark decision striking down the constitutional right to abortion and thereby overruling both Roe v. Wade and Planned Parenthood v. Casey. In its wake, the Dobbs decision left many questioning whether their most sensitive information, information relating to their reproductive health care, would remain private. Dobbs set in motion a web of state laws that make having, providing, or aiding and abetting the provision of an abortion a criminal offense, and many now fear that enforcing those laws will require data tracking. Private groups and state agencies, ranging from the health tech sector to the hospitality industry, may be asked to turn over data as a form of cooperation or as part of the prosecution of these new crimes.

Thus, the question arises: Exactly how much of my information is actually private?

When determining one’s respective right to privacy, it is important to consider what “privacy” actually is. Ultimately, the scope of privacy is wide-ranging. Some may consider the term by its literal definition, where privacy is the quality or state of being apart from company or observation. Alternatively, some may conceptualize privacy a bit further and view it as a dignitary right focused on knowledge someone may or may not possess about a person. Others may not view privacy by its definition at all, but rather cement their views in the belief that a person’s private information should be free from public scrutiny and that all people have a right to be left alone.

Regardless of one’s opinions on privacy, it is important to understand that, with respect to the U.S. Constitution, you have no explicitly recognized right to privacy.

How could that be possible? Some may point to the First Amendment, which preserves a person’s rights of speech and assembly, or perhaps the Fourth Amendment, which restricts the government’s intrusion into people’s private property and belongings. However, these amendments address specific privacy interests tied to freedom and liberty, with the goal of limiting government interference. They do not constitute an explicit, overarching constitutional right to privacy. While the right to privacy is not specifically listed in the Constitution, the Supreme Court has recognized it as an outgrowth of protections for individual liberty.

In Griswold v. Connecticut, the Supreme Court concluded that people have privacy rights that prevent the government from forbidding married couples from using contraception. That ruling first identified people’s right to independently control the most personal aspects of their lives, thus creating an implicit right to privacy. Later, in Roe v. Wade, the Court extended this right of privacy to include a woman’s right to have an abortion, holding that “the right of decisional privacy is based in the Constitution’s assurance that people cannot be ‘deprived of life, liberty or property, without due process of law.’” The Roe decision largely rested on the notion that the 14th Amendment contains an implicit right to privacy and, more generally, protects against state interference in a person’s private decisions. However, the Dobbs ruling has now dismissed this precedent, and the implicit right of privacy no longer extends to abortion. With a 6-3 majority, the Court reasoned that abortion lacked due process protection, as it was not mentioned in the Constitution and was outlawed in many states at the time of the Roe decision.

Fast forward to today: some government entities have attempted to make progress in preserving an individual’s privacy, particularly in relation to health care. The Biden administration released an executive order aimed at protecting access to abortion and treatment for pregnancy complications. Additionally, the Federal Trade Commission has started to implement federal privacy rules for consumer data, citing “a need to protect people’s right to seek healthcare information.” However, most of this progress centers on a misconception that “privacy” and “data protection” are the same thing.

So, let’s set the record straight: privacy and data protection are not the same thing. 

While data protection does stem from the right to privacy, it mainly focuses on ensuring that data has been fairly processed. With the concept of privacy constantly intertwined with freedom and liberty over the past few decades, it can be difficult for people to fully grasp exactly which of their information is private. The Dobbs majority pointed out a distinction between privacy and liberty, noting that “as to precedent, citing a broad array of cases, the Court found support for a constitutional ‘right of personal privacy.’ But Roe conflated the right to shield information from disclosure and to make and implement important personal decisions without governmental interference.”

There is a valid concern that personal information, ranging from instant messages and location history to third-party app usage and digital records, can end up being subpoenaed or sold to law enforcement. In response to the Dobbs decision, the U.S. Department of Health and Human Services issued guidance stating that unless a state law “expressly requires” reporting of certain health conditions, the HIPAA exemption for disclosure to law enforcement does not apply. However, some people may not realize that application privacy agreements and HIPAA medical privacy rules do not automatically protect information against subpoenas. Meanwhile, data brokers will not hesitate to sell any and all personal information they have access to, to the highest bidder.

“So now what?” 

Ultimately, the Dobbs decision serves as a rather harsh reminder of just how valuable our privacy is and what can happen if we lose it. As some of us have already realized, companies, governments, and even our peers are incredibly interested in our private lives. With respect to protecting reproductive freedom, it is imperative to establish federal privacy laws that protect information related to health care from being handed over to law enforcement unless doing so is absolutely necessary to avert substantial public harm. While it is unfortunate that individuals are left solely responsible for protecting themselves against corporate or governmental surveillance, everyone must remain vigilant and aware of where their information is going.