
Sci-Tech

Lynn Conway, Computing Pioneer and Transgender Advocate, Dies at 86


Lynn Conway, a pioneering computer scientist who, despite her significant technological innovations, was fired by IBM in the 1960s after telling managers that she was transgender — and who received a rare formal apology from the company 52 years later — died on June 9 in Jackson, Mich. She was 86.

Her husband, Charles Rogers, said she died in a hospital from complications of two recent heart attacks.

In 1968, after leaving IBM, Ms. Conway was among the earliest Americans to undergo gender reassignment surgery. But she kept it a secret, living in what she called “stealth” mode for 31 years out of fear of career reprisals and concern for her physical safety. She rebuilt her career from scratch, eventually landing at the fabled Xerox PARC laboratory, where she again made important contributions in her field. After she publicly disclosed her transition in 1999, she became a prominent transgender activist.

IBM offered its apology to her in 2020, in a ceremony that 1,200 employees watched virtually.

Ms. Conway was “probably our very first employee to come out,” Diane Gherson, then an IBM vice president, told the gathering. “And for that, we deeply regret what you went through — and know I speak for all of us.”

Ms. Conway’s innovations in her field were not always recognized, both because of her hidden past at IBM and because designing the guts of a computer is unsung work. But her contributions paved the way for personal computers and cellphones and bolstered national defense.

In 2009, the Institute of Electrical and Electronics Engineers gave Ms. Conway its Computer Pioneer Award, citing her “foundational contributions” to the development of supercomputers at IBM and her creation, at Xerox PARC, of a new way to design computer chips — “thereby launching a worldwide revolution.”

At Xerox in the 1970s, Ms. Conway, while working with Carver Mead of the California Institute of Technology, developed a way to pack millions of circuits onto a microchip, a process known as very large-scale integration, or VLSI.

“My field would not exist without Lynn Conway,” Valeria Bertacco, a professor of computer science and engineering at the University of Michigan, was quoted as saying in an online tribute to Ms. Conway. “Chips used to be designed by drawing them with paper and pencil like an architect’s blueprints in the predigital era. Conway’s work developed algorithms that enabled our field to use software to arrange millions, and later billions, of transistors on a chip.”

Lynn Ann Conway was born on Jan. 2, 1938, in Mount Vernon, N.Y., to Rufus and Christine Savage. Her father was a chemical engineer for Texaco, and her mother taught kindergarten. The couple divorced when Lynn, the elder of two children, was 7.

“Although I was born and raised as a boy,” Ms. Conway wrote in a long personal account of her life that she began posting online in 2000, “all during my childhood years I felt like, and desperately wanted to be, a girl.”

Her math and science talents were quickly apparent. At 16, she built a reflecting telescope with a six-inch mirror.

As a student at the Massachusetts Institute of Technology in the 1950s, she injected herself with estrogen and dressed as a woman off-campus.

But the contradictions of her double life caused intense stress; her grades fell, and she dropped out of M.I.T.

She enrolled at Columbia University in 1961 and went on to earn bachelor’s and master’s degrees in electrical engineering.

She was offered a position at IBM’s research center in Yorktown Heights, N.Y., where she was assigned to the secretive Project Y, which was designing the world’s fastest supercomputer. When the engineers relocated to Menlo Park, Calif., Ms. Conway moved to what would soon become the global hub of technology known as Silicon Valley.

By then she was married to a nurse, and the couple had two daughters. “The marriage itself was an illusion,” Ms. Conway wrote. She had lost none of the overwhelming conviction that she inhabited the wrong body, and at one point she put a pistol to her head in an effort to end her life.

In the mid-1960s, she learned about the pioneering hormonal and surgical procedures that a handful of doctors were performing. She told her spouse of her desire to transition, which broke up the marriage. She was barred from contact with her children for many years by their mother.

“When IBM fired me, all my family, relatives, friends and many colleagues, too, simultaneously lost confidence in me,” Ms. Conway wrote on her website. “They became ashamed being seen with me, and very embarrassed about what I was doing. None of them would have anything to do with me after that.”

Seeking work post-transition, she was rejected for jobs once she disclosed her medical history. Nor did she feel she could mention her IBM work history. “I had to start all over pretty much from scratch technically, and prove myself all over again,” she wrote.

“The idea of being ‘outed’ and somehow declared to ‘be a man’ was an unthinkable thing to be avoided at all costs,” she added, “so for the following 30 years I almost never talked about my past to anyone other than close friends and a few lovers.”

She finally found work as a contract programmer. That work led to a better position at the Memorex Corporation, the recording tape company, and, in 1973, to a job at Xerox’s new Palo Alto Research Center, a hub of brain power and innovation that famously gave birth to the personal computer, the point-and-click user interface and the Ethernet protocol.

Ms. Conway’s breakthrough in designing complex computer chips with Dr. Mead was codified in their 1979 textbook, “Introduction to VLSI Systems,” which became a standard handbook for waves of computer science students and engineers.

In 1983, Ms. Conway was recruited to lead a supercomputer program at the Defense Department’s Advanced Research Projects Agency, or DARPA. That she passed her security clearance review reassured her that being transgender was becoming less stigmatized.

She went on to accept positions as a professor and associate dean in the engineering school at the University of Michigan, from which she retired in 1998. She was elected to the National Academy of Engineering and inducted into the Electronic Design Hall of Fame.

In the late 1990s, a researcher exploring the work of IBM in the ’60s came across Ms. Conway’s contributions to computer design, which had gone almost entirely unrecognized because of the past identity she had concealed.

At IBM, she had developed a way to program a computer to perform multiple operations at once, cutting down on processing time. Known as dynamic instruction scheduling, the technology became incorporated in many superfast computers.

Fearing that she would be outed by the research into IBM’s history, Ms. Conway decided to tell the story herself, on her website and in interviews with The Los Angeles Times and Scientific American.

In 2002 she married Mr. Rogers, an engineer she had met on a canoe outing in Ann Arbor, Mich. In addition to him, she is survived by her daughters, whom Mr. Rogers said were largely estranged from her, and six grandchildren.

In retirement, she became an elder stateswoman of the transgender community. She emailed and spoke with many who were transitioning, shared information on gender surgeries and advocated transgender acceptance.

She also campaigned against psychotherapists who, activists said, sought to define being transgender as a pathology.

On her website, Ms. Conway reflected on the increasing, if imperfect, acceptance of transgender people since she had hidden her transition.

“Fortunately, those dark days have receded,” she wrote. “Nowadays many tens of thousands of transitioners have not only moved on into happy and fulfilling lives, but are also open and proud about their life accomplishments.”





How Mark Zuckerberg’s Meta Failed Children on Safety, States Say


In April 2019, David Ginsberg, a Meta executive, emailed his boss, Mark Zuckerberg, with a proposal to research and reduce loneliness and compulsive use on Instagram and Facebook.

In the email, Mr. Ginsberg noted that the company faced scrutiny for its products’ impacts “especially around areas of problematic use/addiction and teens.” He asked Mr. Zuckerberg for 24 engineers, researchers and other staff, saying Instagram had a “deficit” on such issues.

A week later, Susan Li, now the company’s chief financial officer, informed Mr. Ginsberg that the project was “not funded” because of staffing constraints. Adam Mosseri, Instagram’s head, ultimately declined to finance the project, too.

The email exchanges are just one slice of the evidence cited in more than a dozen lawsuits filed since last year by the attorneys general of 45 states and the District of Columbia. The states accuse Meta of unfairly ensnaring teenagers and children on Instagram and Facebook while deceiving the public about the hazards. Using a coordinated legal approach reminiscent of the government’s pursuit of Big Tobacco in the 1990s, the attorneys general seek to compel Meta to bolster protections for minors.

A New York Times analysis of the states’ court filings — including roughly 1,400 pages of company documents and correspondence filed as evidence by the State of Tennessee — shows how Mr. Zuckerberg and other Meta leaders repeatedly promoted the safety of the company’s platforms, playing down risks to young people, even as they rejected employee pleas to bolster youth guardrails and hire additional staff.

In interviews, the attorneys general of several states suing Meta said Mr. Zuckerberg had led his company to drive user engagement at the expense of child welfare.

“A lot of these decisions ultimately landed on Mr. Zuckerberg’s desk,” said Raúl Torrez, the attorney general of New Mexico. “He needs to be asked explicitly, and held to account explicitly, for the decisions that he’s made.”

The state lawsuits against Meta reflect mounting concerns that teenagers and children on social media can be sexually solicited, harassed, bullied, body-shamed and algorithmically induced into compulsive online use. Last Monday, Dr. Vivek H. Murthy, the United States surgeon general, called for warning labels to be placed on social networks, saying the platforms present a public health risk to young people.

His warning could boost momentum in Congress to pass the Kids Online Safety Act, a bill that would require social media companies to turn off features for minors, like bombarding them with phone notifications, that could lead to “addiction-like” behaviors. (Critics say the bill could hinder minors’ access to important information. The News/Media Alliance, a trade group that includes The Times, helped win an exemption in the bill for news sites and apps that produce news videos.)

In May, New Mexico arrested three men who were accused of targeting children for sex after, Mr. Torrez said, they solicited state investigators who had posed as children on Instagram and Facebook. Mr. Torrez, a former child sex crimes prosecutor, said Meta’s algorithms enabled adult predators to identify children they would not have found on their own.

Meta disputed the states’ claims and has filed motions to dismiss their lawsuits.

In a statement, Liza Crenshaw, a spokeswoman for Meta, said the company was committed to youth well-being and had many teams and specialists devoted to youth experiences. She added that Meta had developed more than 50 youth safety tools and features, including limiting age-inappropriate content and restricting teenagers under 16 from receiving direct messages from people they didn’t follow.

“We want to reassure every parent that we have their interests at heart in the work we’re doing to help provide teens with safe experiences online,” Ms. Crenshaw said. The states’ legal complaints, she added, “mischaracterize our work using selective quotes and cherry-picked documents.”

But parents who say their children died as a result of online harms challenged Meta’s safety assurances.

“They preach that they have safety protections, but not the right ones,” said Mary Rodee, an elementary school teacher in Canton, N.Y., whose 15-year-old son, Riley Basford, was sexually extorted on Facebook in 2021 by a stranger posing as a teenage girl. Riley died by suicide several hours later.

Ms. Rodee, who sued the company in March, said Meta had never responded to the reports she submitted through automated channels on the site about her son’s death.

“It’s pretty unfathomable,” she said.

Meta has long wrestled with how to attract and retain teenagers, who are a core part of the company’s growth strategy, internal company documents show.

Teenagers became a major focus for Mr. Zuckerberg as early as 2016, according to the Tennessee complaint, when the company was still known as Facebook and owned apps including Instagram and WhatsApp. That spring, an annual survey of young people by the investment bank Piper Jaffray reported that Snapchat, a disappearing-message app, had surpassed Instagram in popularity.

Later that year, Instagram introduced a similar disappearing photo- and video-sharing feature, Instagram Stories. Mr. Zuckerberg directed executives to focus on getting teenagers to spend more time on the company’s platforms, according to the Tennessee complaint.

The “overall company goal is total teen time spent,” wrote one employee, whose name is redacted, in an email to executives in November 2016, according to internal correspondence among the exhibits in the Tennessee case. Participating teams should increase the number of employees dedicated to projects for teenagers by at least 50 percent, the email added, noting that Meta already had more than a dozen researchers analyzing the youth market.

In April 2017, Kevin Systrom, Instagram’s chief executive, emailed Mr. Zuckerberg asking for more staff to work on mitigating harms to users, according to the New Mexico complaint.

Mr. Zuckerberg replied that he would include Instagram in a plan to hire more staff, but he said Facebook faced “more extreme issues.” At the time, legislators were criticizing the company for having failed to hinder disinformation during the 2016 U.S. presidential campaign.

Mr. Systrom asked colleagues for examples to show the urgent need for more safeguards. He soon emailed Mr. Zuckerberg again, saying Instagram users were posting videos involving “imminent danger,” including a boy who shot himself on Instagram Live, the complaint said.

Two months later, the company announced that the Instagram Stories feature had hit 250 million daily users, dwarfing Snapchat. Mr. Systrom, who left the company in 2018, didn’t respond to a request for comment.

Meta said an Instagram team developed and introduced safety measures and experiences for young users. The company didn’t respond to a question about whether Mr. Zuckerberg had provided the additional staff.

In January 2018, Mr. Zuckerberg received a report estimating that four million children under the age of 13 were on Instagram, according to a lawsuit filed in federal court by 33 states.

Facebook’s and Instagram’s terms of use prohibit users under 13. But the company’s sign-up process for new accounts enabled children to easily lie about their age, according to the complaint. Meta’s practices violated a federal children’s online privacy law requiring certain online services to obtain parental consent before collecting personal data, like contact information, from children under 13, the states allege.

In March 2018, The Times reported that Cambridge Analytica, a voter profiling firm, had covertly harvested the personal data of millions of Facebook users. That set off more scrutiny of the company’s privacy practices, including those involving minors.

Mr. Zuckerberg testified the next month at a Senate hearing, “We don’t allow people under the age of 13 to use Facebook.”

Attorneys general from dozens of states disagree.

In late 2021, Frances Haugen, a former Facebook employee, disclosed thousands of pages of internal documents that she said showed the company valued “profit above safety.” Lawmakers held a hearing, grilling her on why so many children had accounts.

Meanwhile, company executives knew that Instagram use by children under 13 was “the status quo,” according to the joint federal complaint filed by the states. In an internal chat in November 2021, Mr. Mosseri acknowledged those underage users and said the company’s plan to “cater the experience to their age” was on hold, the complaint said.

In its statement, Meta said Instagram had measures in place to remove underage accounts when the company identified them. Meta has said it has regularly removed hundreds of thousands of accounts that could not prove they met the company’s age requirements.

A company debate over beauty filters on Instagram encapsulated the internal tensions over teenage mental health — and ultimately the desire to engage more young people prevailed.

It began in 2017 after Instagram introduced camera effects that enabled users to alter their facial features to make them look funny or “cute/pretty,” according to internal emails and documents filed as evidence in the Tennessee case. The move was made to boost engagement among young people. Snapchat already had popular face filters, the emails said.

But a backlash ensued in the fall of 2019 after Instagram introduced an appearance-altering filter, Fix Me, which mimicked the nip/tuck lines that cosmetic surgeons draw on patients’ faces. Some mental health experts warned that the surgery-like camera effects could normalize unrealistic beauty standards for young women, exacerbating body-image disorders.

As a result, Instagram in October 2019 temporarily disallowed camera effects that made dramatic, surgical-looking facial alterations — while still permitting obviously fantastical filters, like goofy animal faces. The next month, concerned executives proposed a permanent ban, according to Tennessee court filings.

Other executives argued that a ban would hurt the company’s ability to compete. One senior executive sent an email saying Mr. Zuckerberg was concerned whether data showed real harm.

In early 2020, ahead of an April meeting with Mr. Zuckerberg to discuss the issue, employees prepared a briefing document on the ban, according to the Tennessee court filings. One internal email noted that employees had spoken with 18 mental health experts, each of whom raised concerns that cosmetic surgery filters could “cause lasting harm, especially to young people.”

But the meeting with Mr. Zuckerberg was canceled. Instead, the chief executive told company leaders that he was in favor of lifting the ban on beauty filters, according to an email he sent that was included in the court filings.

Several weeks later, Margaret Gould Stewart, then Facebook’s vice president for product design and responsible innovation, reached out to Mr. Zuckerberg, according to an email included among the exhibits. In the email, she noted that as a mother of teenage daughters, she knew social media put “intense” pressure on girls “with respect to body image.”

Ms. Stewart, who subsequently left Meta, did not respond to an email seeking comment.

In the end, Meta said it barred filters “that directly promote cosmetic surgery, changes in skin color or extreme weight loss” and clearly indicated when one was being used.

In 2021, Meta began planning for a new social app. It was to be aimed specifically at children and called Instagram Kids. In response, 44 attorneys general wrote a letter that May urging Mr. Zuckerberg to “abandon these plans.”

“Facebook has historically failed to protect the welfare of children on its platforms,” the letter said.

Meta subsequently paused plans for an Instagram Kids app.

By August, company efforts to protect users’ well-being had become “increasingly urgent” for Meta, according to another email to Mr. Zuckerberg filed as an exhibit in the Tennessee case. Nick Clegg, now Meta’s head of global affairs, warned his boss of mounting concerns from regulators about the company’s impact on teenage mental health, including “potential legal action from state A.G.s.”

Describing Meta’s youth well-being efforts as “understaffed and fragmented,” Mr. Clegg requested funding for 45 employees, including 20 engineers.

In September 2021, The Wall Street Journal published an article saying Instagram knew it was “toxic for teen girls,” escalating public concerns.

An article in The Times that same month mentioned a video that Mr. Zuckerberg had posted of himself riding across a lake on an “electric surfboard.” Internally, Mr. Zuckerberg objected to that description, saying he was actually riding a hydrofoil he pumped with his legs and wanted to post a correction on Facebook, according to employee messages filed in court.

Mr. Clegg found the idea of a hydrofoil post “pretty tone deaf given the gravity” of recent accusations that Meta’s products caused teenage mental health harms, he said in a text message with communications executives included in court filings.

Mr. Zuckerberg went ahead with the correction.

In November 2021, Mr. Clegg, who had not heard back from Mr. Zuckerberg about his request for more staff, sent a follow-up email with a scaled-down proposal, according to Tennessee court filings. He asked for 32 employees, none of them engineers.

Ms. Li, the finance executive, responded a few days later, saying she would defer to Mr. Zuckerberg and suggested that the funding was unlikely, according to an internal email filed in the Tennessee case. Meta didn’t respond to a question about whether the request had been granted.

A few months later, Meta said that although its revenue for 2021 had increased 37 percent to nearly $118 billion from a year earlier, fourth-quarter profit plummeted because of a $10 billion investment in developing virtual reality products for immersive realms, known as the metaverse.

Last fall, the Match Group, which owns dating apps like Tinder and OKCupid, found that ads the company had placed on Meta’s platforms were running adjacent to “highly disturbing” violent and sexualized content, some of it involving children, according to the New Mexico complaint. Meta removed some of the posts flagged by Match, telling the dating giant that “violating content may not get caught a small percentage of the time,” the complaint said.

Dissatisfied with Meta’s response, Bernard Kim, the chief executive of the Match Group, reached out to Mr. Zuckerberg by email with a warning, saying his company could not “turn a blind eye,” the complaint said.

Mr. Zuckerberg didn’t respond to Mr. Kim, according to the complaint.

Meta said the company had spent years building technology to combat child exploitation.

Last month, a judge denied Meta’s motion to dismiss the New Mexico lawsuit. But the court granted a request to drop Mr. Zuckerberg, who had been named as a defendant, from the case.






These Grieving Parents Want Congress to Protect Children Online


Deb Schmill has become a fixture on Capitol Hill. Last week alone, she visited the offices of 13 lawmakers, one of more than a dozen trips she has made from her home near Boston over the past two years.

In each meeting, Ms. Schmill talks about her daughter Becca, who died in 2020 at age 18. Ms. Schmill said Becca had died after taking fentanyl-laced drugs bought on Facebook. Before that, she said, her daughter was raped by a boy she had met online, then was cyberbullied on Snapchat.

“I have to do what I can to help pass legislation to protect other children and to prevent what happened to Becca from happening to them,” Ms. Schmill, 60, said. “It’s my coping mechanism.”

Ms. Schmill is among dozens of parents who are lobbying for the Kids Online Safety Act, or KOSA, a bill that would require social media, gaming and messaging apps to limit features that could heighten depression or bullying or lead to sexual exploitation. The bill, which has the greatest momentum of any broad tech industry legislation in years, would also require the tech services to turn on the highest privacy and safety settings by default for users under 17 and let youths opt out of some features that can lead to compulsive use.

Modeling themselves in part on Mothers Against Drunk Driving, which pushed for the 1984 federal law mandating a minimum drinking age of 21, about 20 of the parents have formed a group called ParentsSOS. Like members of MADD, the parents carry photos of their children who they say lost their lives because of social media, and explain their personal tragedies to legislators.

Dozens more parents have created organizations to fight social media addiction, eating disorders and fentanyl poisoning. All are pushing KOSA, swarming Capitol Hill to share how they say their children were harmed.

The bill, introduced in 2022, has bipartisan support in the Senate and is poised for a vote. It recently passed a key House subcommittee vote. President Biden has also supported the bill.

Dr. Vivek Murthy, the U.S. surgeon general, said this week that social media had contributed to an “emergency” mental health crisis among youths, adding more momentum.

But KOSA still faces steep obstacles. Tech lobbyists and the American Civil Liberties Union are fighting it, saying it could undermine free speech. Others worry that limiting children’s access to social media may further isolate vulnerable youths, including those in the L.G.B.T.Q. community.

To amp up the pressure as Congress’s August summer break approaches, ParentsSOS launched a Father’s Day ad campaign in Times Square, in New York, and a commercial campaign on streaming TV. (Fairplay, a child advocacy nonprofit, and the Eating Disorders Coalition provided funding.)

“I’ve had friends say, ‘Just let go and move on because it’s so painful,’ but I could not be quiet about what I’ve learned, which is that social media companies don’t have any accountability,” said Kristin Bride, 57, who lives in Oregon. Her son Carson died by suicide in 2020 at the age of 16 after what she said had been relentless bullying via an anonymous messaging app connected to Snapchat.

Snap, X and Microsoft have said they support KOSA.

“The safety of young people is an urgent priority, and we call on Congress to pass the Kids Online Safety Act,” Snapchat’s parent company, Snap, said in a statement. Snap no longer allows anonymous messaging apps to connect to its platform.

YouTube and Meta, which owns Facebook and Instagram, declined to comment. TikTok did not respond to a request for comment.

The parents’ push aligns with a global movement to regulate youth safety online. The European Union’s Digital Services Act of 2022 requires social media sites to block harmful content and restricts the use of features that can lead to addictive use by youths. Last year, Britain adopted a similar online safety law for children.

Domestically, 45 state attorneys general have sued Meta over allegations that it harms young users. Last year, 23 state legislatures adopted child safety laws, and this week New York adopted a law that restricts social media platforms from using recommendation feeds that could lead to compulsive consumption by users under 18.

Many of the parents turned lobbyists cited “The Social Dilemma,” a 2020 documentary about social media harms, as a call to action. They said they were also enraged by revelations in 2021 by the whistle-blower Frances Haugen, a former Facebook employee who testified in Congress that the company knew the dangers for young people on its apps.

“For the first time, I understood that it was the design, it was the companies,” said Christine McComas, 59, who lives in Maryland. She said her daughter Grace died at 15 by suicide in 2012 after being bullied on Twitter.

Many of the parents said the Center for Humane Technology, a nonprofit that advocates social media regulations and was part of the documentary, had connected them after they reached out.

Maurine Molak’s son David died by suicide in 2016 at age 16 after what she said had been cyberbullying on Instagram and messaging apps. Another of her sons found an online memorial page for Grace McComas and encouraged his mother to get in touch with Ms. McComas via email.

The two mothers began having phone calls and connected with other parents, too. Ms. Molak had set up a foundation to educate the public about online bullying and to push for anti-bullying state legislation.

By early 2022, some of the parents had begun working with Fairplay to push for state child safety laws. That February, Senators Richard Blumenthal, Democrat of Connecticut, and Marsha Blackburn, Republican of Tennessee, introduced KOSA.

It had early but modest support, moving out of a Senate committee before stalling for months. Growing impatient, several parents showed up on Capitol Hill that November. Ms. Bride and other parents said they had entered the office of Senator Maria Cantwell, chair of the Commerce Committee and Democrat of Washington, and demanded a meeting. She met with them the next day.

Ms. Cantwell was visibly moved and rubbed the backs of several parents as they talked about their children, Ms. Bride said.

“Having to look at us and to know that our children are no longer with us hits them, and it has gotten people on board,” Ms. Bride said. Ms. Cantwell’s office declined to comment.

Ms. Cantwell became a vocal supporter of the bill and then tried to attach it to a year-end spending bill, but the effort failed.

For much of last year, the bill sat, in part over concerns that the language requiring companies to design sites to protect children was too vague. Some legislators were also concerned that the bill would give attorneys general too much power to police certain content, a potential political weapon.

Discouraged, the parents called one another to stay motivated. In September, Ms. Schmill rented a short-term apartment a 10-minute walk from the Capitol. She changed in and out of sneakers carried in a canvas bag as she visited the offices of nearly all 100 senators to tell them about Becca.

“As I thought about facing another year of her birth date and death date, for me to cope with having to live through another anniversary, I had to feel like I had to be doing something productive in her memory,” Ms. Schmill said.

Late last year, around the time the Senate Judiciary Committee announced a January hearing on child safety with tech chief executives, the parents decided to form ParentsSOS. The initiative, intended to help them gain more support for KOSA, was funded by Fairplay and Ms. Molak’s foundation focused on cyberbullying.

The parents — communicating in emails and texts and over Zoom — decided to go to the child safety hearing to confront the executives from Discord, Meta, Snap, TikTok and X with photos of their children.

At the hearing, Senator Josh Hawley, Republican of Missouri, tried to force Mark Zuckerberg, Meta’s chief executive, to apologize to the parents. Mr. Zuckerberg turned to the parents and said he was “sorry for everything you’ve all gone through.”

Todd Minor, a member of ParentsSOS who was in attendance, said the apology rang hollow. His 12-year-old son, Matthew, died in 2019 after taking part, Mr. Minor said, in a “blackout challenge” on TikTok, in which people choke themselves.

“We need KOSA. It’s that simple,” Mr. Minor, 48, said.

The parents then met with the Senate leader, Chuck Schumer, Democrat of New York, who promised to bring KOSA to a floor vote by June 20, according to Ms. Schmill and others in the meetings.

In April, the House introduced a companion bill.

Ms. Molak, 61, a San Antonio resident, met with Representative Randy Weber, Republican of Texas, last month to talk about her son David.

“Why am I not on this bill? Let’s get on this!” Mr. Weber, a member of the House Energy and Commerce Committee, said to his staff during the meeting, according to Ms. Molak. Mr. Weber’s office did not respond to a request for comment.

But progress in that committee stalled this month. The Senate version of the bill still faces opposition.

Ms. Schmill and three of the other parents trekked back to the Capitol again last week.

“I need to keep busy, to keep trying,” Ms. Schmill said.


If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline or go to SpeakingOfSuicide.com/resources for a list of additional resources.




Cyberattack on CDK Global Disrupts Car Sales in U.S. and Canada


Thousands of auto dealers across the United States and Canada are suffering disruptions to their operations as a result of cyberattacks on a provider of critical software and data services used in auto retailing.

The provider, CDK Global, said it was targeted in two attacks on Wednesday, prompting the company to shut down its systems to prevent the loss of customer data and to allow testing and other measures to restore its services.

“We are assessing the impact and providing regular updates to our customers,” CDK Global said in a statement. “We remain vigilant in our efforts to reinstate our services and get our dealers back to business as usual as quickly as possible.”

CDK provides services to more than 15,000 retail locations. Its dealer management systems store customer records and automate much of the paperwork and data involved in selling and servicing cars and trucks.

Dealers said the outage had slowed sales and forced them to find alternative methods to produce the titles, contracts, leases, registration cards and other forms that must be delivered to customers, banks and state motor vehicle authorities.

“It is definitely annoying, no doubt,” said Brian Benstock, general manager and vice president of Paragon Honda in the New York borough of Queens. “But we’re still open for business. We’re still selling cars.”

He said his franchise had other systems to retrieve customer data. “We can produce contracts,” he said. “For customers, it’s pretty seamless.”

The disruption has come at a critical time for dealers as they head into the final two weekends of the month, typically a busy time for new-car sales. Many are also preparing for Fourth of July sales and other summer promotions.

Dealers said that in some cases they were reverting to writing contracts by hand, or asking customers to wait a few days to take delivery of their vehicles.

Dealers have less leeway in servicing and repairing vehicles, since customers often expect their cars back within a few hours, though the lack of access to customer data in most cases won’t prevent technicians from performing repair work.




Copyright © 2024 World Daily Info. Powered by Columba Ventures Co. Ltd.