With more sports fans streaming content than ever before, teams and leagues have had to adapt their approach to distribution based on consumer habits. Formula 1 has put a newfound emphasis on aggregating data on F1 fans and sports fans alike, using it to improve the experience of the more than 500 million Formula 1 fans worldwide. Global Research Director Matt Roberts is leading the charge, cultivating a new generation of Formula 1 fans through social media, video games and other online content, with the goal of turning Formula 1 into an entertainment brand that extends beyond the races themselves.
Galapagos Reports Promising Phase III Data in Rheumatoid Arthritis
Clara Rodríguez Fernández on 12/09/2018
in a Phase II trial last week.
The inflammation market is huge, expected to reach over €100Bn by 2020. The world’s best-selling drug, Humira (adalimumab), is indicated for multiple inflammatory conditions, including rheumatoid arthritis. By targeting patients who do not respond to biological drugs like Humira, Galapagos could have a competitive advantage in this market. However, it will face competition from its previous partner AbbVie.
Before the partnership with Gilead, Galapagos was working with AbbVie on the development of filgotinib. However, the big pharma walked out of the agreement in 2015 and is now developing its own candidate drug for rheumatoid arthritis. AbbVie’s drug candidate, upadacitinib, which has a mechanism of action similar to filgotinib’s, also yielded positive results in April.
Before Kim Slingerland downloaded the Fun Kid Racing app for her then-5-year-old son, Shane, she checked to make sure it was in the family section of the Google Play store and rated as age-appropriate. The game, which lets children race cartoon cars with animal drivers, has been downloaded millions of times.
Until last month, the app also shared users’ data, sometimes including the precise location of devices, with more than a half-dozen advertising and online tracking companies. On Tuesday evening, New Mexico’s attorney general filed a lawsuit claiming that the maker of Fun Kid Racing had violated a federal children’s privacy law through dozens of Android apps that shared children’s data.
“I don’t think it’s right,” said Ms. Slingerland, a mother of three in Alberta, Canada. “I don’t think that’s any of their business, location or anything like that.”
The suit accuses the app maker, Tiny Lab Productions, along with online ad businesses run by Google, Twitter and three other companies, of flouting a law intended to prevent the personal data of children under 13 from falling into the hands of predators, hackers and manipulative marketers. The suit also contends that Google misled consumers by including the apps in the family section of its store.
An analysis by The New York Times found that children’s apps by other developers were also collecting data. The review of 20 children’s apps — 10 each on Google Android and Apple iOS — found examples on both platforms that sent data to tracking companies, potentially violating children’s privacy law; the iOS apps sent less data overall.
These findings are consistent with those published this spring by academic researchers who analyzed nearly 6,000 free children’s Android apps. They reported that more than half of the apps, including those by Tiny Lab, shared details with outside companies in ways that may have violated the law.
Although federal law doesn’t provide many digital privacy protections for adults, there are safeguards for children under 13. The Children’s Online Privacy Protection Act protects them from being improperly tracked, including for advertising purposes. Without explicit, verifiable permission from parents, children’s sites and apps are prohibited from collecting personal details including names, email addresses, geolocation data and tracking codes like “cookies” if they’re used for targeted ads.
But the New Mexico lawsuit and the analyses of children’s apps suggest that some app developers, ad tech companies and app stores are falling short in protecting children’s privacy.
“These sophisticated tech companies are not policing themselves,” the New Mexico attorney general, Hector Balderas, said. “The children of this country ultimately pay the price.”
Jessica Rich, a former consumer protection director at the Federal Trade Commission, called the findings “significant and disturbing.” They suggest, she said, “that the ‘safe spaces’ for kids in the apps stores aren’t safe at all.”
A Google spokesman, Aaron Stein, said that developers are responsible for declaring whether their apps are primarily for children, and that apps in the store’s family section “must comply with more stringent policies.”
A Twitter spokesman said that the company’s ad platform, MoPub, does not allow its services to be used to collect information from children’s apps for targeted advertising and that it suspended the maker of Fun Kid Racing in September of 2017 for violating its policies.
Jonas Abromaitis, founder of the Lithuania-based Tiny Lab, said he believed he had followed the law and Google’s requirements, because the app asked for users’ ages and tracked those who identified as over 13. “We thought we were doing everything the right way,” he said.
A Market for Tracking
Dozens of companies now track consumers on their phones to build behavioral profiles that help tailor the ads they see. Two of the largest are AdMob and MoPub.
To make money, app developers generally have two options: publish free apps supported by ads, or charge users. But children don’t have the money to make purchases, and under federal law they can’t be tracked for ad targeting.
[Graphic: How companies track children’s personal data to target ads. By Anjali Singhvi and Rich Harris]
The app industry has had trouble adapting to children, said Dylan Collins, the chief executive of SuperAwesome, a technology firm that helps companies build apps for children without tracking them.
Mr. Collins said some top children’s app makers had started charging parents for subscriptions or showing ads that didn’t use tracking. But, he noted, small developers typically sell fewer subscriptions and don’t always sell enough ads using only child-friendly ad networks. “As a result, there’s still a huge amount of data being collected on kids,” he said.
In 2013 Apple introduced a children’s section in its App Store. It told developers that, to be listed there, they could “do no tracking across sites or across apps.” Apple tells parents that it reviews each app in the section “to make sure it does what it says it does.”
Google introduced a similar program, Designed for Families, in 2015. The company informed Android developers that apps that were “primarily child-directed must participate” in the program and that developers must confirm that their apps complied with the children’s privacy law. Google has said it developed its family section to help parents find “suitable, trusted, high-quality apps” for their children.
‘For Children’ vs. ‘for Families’
Mr. Abromaitis, the Tiny Lab founder, created Fun Kid Racing in 2013, after searching unsuccessfully for a racing game to play with his 3-year-old nephew. Other Tiny Lab apps include simple games with titles such as Run Cute Little Pony.
Still, Mr. Abromaitis said in an interview, the company’s apps were directed at “mixed audiences,” with children under 13 forming only part of the market.
The distinction is important: Under privacy law, apps aimed at younger children are prohibited from tracking any users for ads without parental consent, but those intended for a general audience can ask players their age and track older users.
When Tiny Lab submitted apps to Google’s store, it indicated they were for families, not just children, and Google accepted the apps.
In The Times’s tests of Fun Kid Racing in July, the app asked that players select their birth year from a list. But with the default set between 2000 and 2001, a young child eager to get to the next screen could simply tap through quickly and be counted as a teenager. In the tests, the app didn’t collect location data if the player identified as under 13.
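The flaw described above is easy to reproduce. The sketch below is a hypothetical age gate, not Tiny Lab’s actual code, showing how a pre-selected default birth year of 2000 causes a child who taps straight through to be classified as old enough for full ad tracking; the function names and the test date are invented for illustration:

```python
from datetime import date

DEFAULT_BIRTH_YEAR = 2000  # value pre-selected in the birth-year picker

def is_under_13(birth_year: int, today: date) -> bool:
    """Naive age check: compares years only, ignoring how the year was chosen."""
    return (today.year - birth_year) < 13

def tracking_mode(selected_year: int, today: date) -> str:
    """Return which data-collection mode the app would enable."""
    return "child (no tracking)" if is_under_13(selected_year, today) else "full ad tracking"

today = date(2018, 9, 12)
# A 5-year-old (born 2013) who taps past the screen leaves the default in place:
print(tracking_mode(DEFAULT_BIRTH_YEAR, today))  # full ad tracking
# Only a child who actually scrolls to the correct year is protected:
print(tracking_mode(2013, today))                # child (no tracking)
```

The weakness is not in the arithmetic but in the default: the gate only protects children who actively change the pre-selected year.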
In early June, emails show, the academic researchers who had done the earlier study informed Google that app developers “seem to have an incentive to mischaracterize” their children’s apps as “not primarily directed to children,” freeing them to track users for targeted ads. They cited 84 apps from Tiny Lab as examples and said they had identified nearly 3,000 apps in all that appeared to be similarly mislabeled.
In July, a Google manager responded that the company had investigated the Tiny Lab apps and had found they had not violated the privacy law. Google, he said, did not consider “these apps to be designed primarily for children, but for families in general.”
A month later, Google appeared to reverse course: The company told Mr. Abromaitis it had identified a Tiny Lab app that should be designated for children. Google gave Tiny Lab a week to change that app and any others like it. Tiny Lab labeled 10 of its apps for children and used ad networks in them designed for children’s apps. Google approved the updates but flagged more apps at the end of August, Mr. Abromaitis said, so he made another round of changes.
Then, this week, after inquiries from The Times, Google terminated Tiny Lab’s account and removed all of its apps from the Play store, citing multiple policy violations.
Asked about the earlier emails, Google said the statements were made in error and that it doesn’t certify whether apps in the Play store comply with the children’s privacy law.
Mr. Abromaitis said he hoped to work with Google to get back into the store.
Widespread Tracking of Children
The study this spring showed not only that more than half of children’s apps on Android were sharing tracking ID numbers but also that 5 percent collected children’s location or contact information without their parents’ permission.
To evaluate tracking on iOS as well as Android, The Times conducted a small study, looking at 10 apps on each platform. The Times chose a mix of the most popular children’s apps and smaller apps that had been flagged in the academics’ research for sharing data, to test whether the apps had problems on iOS and whether they had been fixed on Android.
Although it is difficult to know whether companies are actually violating the federal rules, six of the Android apps shared data such as precise location, IP addresses and tracking IDs in ways that could be problematic. On iOS, five apps sent IDs to tracking companies in questionable ways.
In addition to Fun Kid Racing, the tests showed one other Android app sending precise location data to other companies: Masha and the Bear: Free Animal Games for Kids, an animated game app with millions of downloads. The iOS version sent advertising ID codes to a company that generally prohibits children’s apps from using its network.
In an email, Indigo Kids, the Cyprus-based maker of the Masha app, said it was not responsible for harvesting children’s information because third-party companies collected the data. “We, as a company, do not collect or store any data of our users,” the company said.
Other apps with data practices that could violate the children’s privacy rules sent data to multiple tracking companies that don’t allow children’s apps, or sent the data with notes in the computer code incorrectly indicating that it hadn’t come from children.
Several apps reviewed by The Times also sent the advertising ID to other companies but said this was for specific purposes allowed under the law, such as preventing an ad from being shown too many times.
Tom Neumayr, an Apple spokesman, said that children’s privacy in apps “is something we take very seriously” and that developers must follow strict guidelines about tracking in children’s apps.
Enforcing the Law
Since the federal children’s online privacy law was enacted in 1998, the Federal Trade Commission has brought nearly 30 cases alleging violations by companies including Sony BMG Music Entertainment and Yelp. All of those firms ultimately settled with the agency.
“The F.T.C. has made enforcement of the Children’s Online Privacy Protection Act a high priority,” said Juliana Gruenwald, an agency spokeswoman.
But the New Mexico lawsuit is different. The state is not just going after a single app maker or ad company; it’s also implicating the ad platforms of Google and Twitter, and the vetting process of Google’s app store.
“Google knows that Tiny Lab’s apps track children unlawfully,” the complaint said. “Google’s bad acts are compounded because it represents to parents” that Tiny Lab’s apps comply with the children’s privacy law and are “safe for children.”
The case is particularly fraught for Google and Twitter, which are each already subject to federal settlements over consumer privacy or security violations. Those settlements prohibit the companies from misrepresenting their consumer data protections, and violations could trigger hefty fines.
“I don’t see any way that anything would change unless there are enforcement actions,” said Serge Egelman, a researcher at the University of California, Berkeley, who helped lead the study this spring.
New Mexico’s attorney general said he hoped the F.T.C. and others in Washington would follow his lead. “This is as much a black eye on the federal government as the tech space,” Mr. Balderas said. “I’m trying to get lawmakers at the federal level to wake up.”
How Game Apps That Captivate Kids Have Been Collecting Their Data
A lawsuit by New Mexico’s attorney general accuses a popular app maker, as well as online ad businesses run by Google and Twitter, of violating children’s privacy law.
By JENNIFER VALENTINO-DeVRIES, NATASHA SINGER, AARON KROLIK and MICHAEL H. KELLER
Figures from labour market analytics provider Innovantage show an increase in online advertising activity for front-line roles across a number of sectors, including finance and insurance. The analysis examined job posting trends, looking at both original job adverts and reposts to identify which sectors saw the highest increase in demand for front-line workers.
The cascade method is an important step in strategy execution. Your strategy is the key to the success of your business venture; having a clear, detailed strategy in front of you will serve as your company’s GPS.
You can decide to run your company from an office, or from the comfort of your own home. By using our free trial, you can see how your business might benefit from software that does all the work for you! Every business is different and will need to adopt different approaches to succeed. If you’re starting a new company, you should develop a business plan. If you choose to build a home-based business, contact the zoning department for your city or county, as you may be required to obtain a home-business license. And while many businesses do not have the same student marketing practices that Apple has, they can still benefit from distinctive opportunities in student marketing.
One reason firms find it difficult to get the macro positioning right is the complex matrix of responsibilities. Any company can use low-cost advertising to promote its products or services. If you’re thinking about how to start a credit repair business, you’ve come to the right place!
Among the best credit repair business opportunities, as one of our front-end affiliates you will be responsible for assisting clients through the credit repair enrollment process. Working through the exercise of developing personas based on market and customer research is fundamental to understanding your intended customer. A good credit repair attorney service can help you understand the credit process and educate you on everything you can personally do to buoy your own credit score.
The software has been triple quality-assurance tested at every level. Software built with crawler techniques lets organisations understand how well they are performing. Friendly, automated and customisable, the software is a tool that can boost anybody’s credit remediation, and it can handle thousands of clients in a single month. Perhaps you’re too busy to download software and manage your own credit repair yourself.
When it comes to software, consider how closely a program meets your needs. It should also come with a free mobile app along with free upgrades and maintenance. As a result, strategic planning software pays for itself in the long run. Credit repair software greatly reduces the time it takes an individual to navigate the administrative financial system. Visit this https://www.i-nexus.com/strategy-execution-software site for more information about the benefits of the software.
For any project to succeed, it is advisable to engage a management professional to assess its timing, budget and scope. A management consultant’s sole goal is to ensure that the project meets the aims and objectives of the organisation. Failing that, you can always hire an experienced third-party specialist to handle it on your behalf.
While an organisation may have the resources up front to build a new system, it is critically important to consider the ongoing maintenance, support and future development necessary for success. A final thought: once a business grows to a certain size, it cannot be managed through email and phone calls alone. Clear and quick communication matters; many of the world’s most successful companies have differentiated themselves not because they offer the best products or the most effective marketing, but because of how they communicate.
With the database, all objectives can be tracked, presenting simple one-click reports for management to assess the progress of their strategic plans. Strategic objectives can also be tracked on dashboards to provide a visually appealing record of information and progress towards goals.
Without such tools, strategy implementation gets absorbed into the everyday working of the company and becomes a reduced priority.
Data protection laws in Asia continue to be introduced and updated. One of the most recent developments in South East Asia is in Thailand. On 22 May 2018, the Thai Cabinet approved in principle a revised draft of Thailand’s first personal data protection act (Draft Act). This Draft Act is currently under consideration by the Council of State.
Thailand currently does not have any specific law regulating data protection. The Office of the Prime Minister first published the Draft Act in 2014. The Draft Act has undergone several rounds of changes and this article aims to give a high level overview of the recently approved version of the Draft Act.
The Draft Act has been revised to replicate many of the concepts and obligations which are common across global data protection laws and in particular the GDPR. We have highlighted some of those key obligations below.
The new law has some key definitions which are similar to data protection laws elsewhere:
“Personal data” is broadly defined as information that is able to directly or indirectly identify a living individual.
“Data controller” is a person (whether a natural or legal person) who has authority to make decisions on collection, usage or disclosure of Personal Data.
“Data processor” is a person (whether a natural or legal person) who collects, uses or discloses Personal Data in compliance with the orders of data controller.
The Draft Act regulates both data controllers and data processors, whether or not they are in Thailand, who collect, use or disclose Personal Data collected from individuals in Thailand (whether or not those individuals are Thai citizens). This means that organizations outside of Thailand may be subject to the Draft Act.
Specific consent is required from the data subject, in writing or via electronic means, prior to or at the time of collection, use or disclosure of personal data, unless one of the prescribed exceptions applies. A data subject may at any time revoke his/her consent, unless there is a restriction under the law or contract on revoking such consent.
Collection of personal data
Collection of personal data must be for a lawful purpose and be directly relevant to, and necessary for, the activities of the data controller. The data controller must inform the data subject of the following, prior to or at the time personal data is collected:
the purpose of the collection;
the personal data to be collected;
to whom the personal data might be disclosed;
contact information of the data controller; and
the rights of the data subject.
This information would usually be provided by way of a collection notice.
Except under limited circumstances prescribed under the Draft Act, personal data must be collected directly from the data subject. Also, the collection of sensitive personal data, such as religious belief, political preference, sexual behaviour or medical records, is prohibited except under limited circumstances prescribed under the Draft Act or ministerial regulation. Examples of the permitted circumstances for collection of sensitive data include where sensitive data is collected to protect or prevent harm to a person’s life, body or health, or to comply with any legal requirement on the data controller.
Cross-border transfer of personal data
Personal data can only be transferred to a country with rigorous data protection measures and in accordance with guidelines to be prescribed by the Personal Data Protection Committee, unless:
the transfer is made pursuant to any applicable law;
consent is obtained from the data subject;
the transfer is in compliance with the contract entered into between the data subject and the data controller;
the transfer is in the interests of a data subject who is incapable of giving consent; or
as otherwise prescribed by ministerial regulation.
Rights of data subject
A data subject is entitled to access his/her own personal data which is held by the data controller, or to request the data controller to disclose the sources of information where such personal data is collected without his/her consent. In the event that the data controller fails to comply with any provision of the Draft Act, a data subject is entitled to request the data controller to delete, destroy, temporarily suspend the use of or anonymize personal data.
Fines and penalties
Both civil and criminal penalties can be imposed on the data controller for violation of the provisions of the Draft Act.
The data controller may continue to use personal data collected prior to the date that the Draft Act comes into force, provided that:
such personal data is only used for the purpose for which it was originally collected; and
a mechanism is made available and publicised by the data controller for the data subject to easily request deletion of his/her personal data.
If the Council of State approves the Draft Act, it will be forwarded to the Thai Cabinet and subsequently to the National Legislative Assembly for approval before coming into force. No official time frame for this process has been announced, so it is difficult at this stage to anticipate the enactment date of the Draft Act.
The Draft Act means that companies doing business in Thailand or handling the data of Thai citizens will need to reconsider their policies and procedures for handling personal data in accordance with the new law once passed. Fortunately, it seems that the approach taken under the Draft Act is not inconsistent with many major data protection laws around the world, so companies with a robust data protection regime in place may not have to make too many changes to accommodate the new law.
Buzzwords like “big data” and “artificial intelligence” have become commonplace in many of today’s business conversations, but what exactly do they mean—and what do they mean for the luxury consumer?
The Chinese government’s “Next Generation Artificial Intelligence Development Plan” was established in 2017, making artificial intelligence (AI) a national priority for China. Simply put, today’s high-net-worth buyers still want a traditional tailor-made buying experience, but they want it fast and hassle-free. Artificial intelligence and big data are what will help give them that.
Big data is a term for the huge volume of varied and unstructured data (sounds, images, texts, messages, transactions, videos, social networks, etc.) that companies collect both online and off, from employee counts to weather histories.
This data, as you probably know, is now being stored and analyzed by companies so that they can look for useful correlations. Companies create a staggering amount of data, over 2.5 quintillion bytes, every day, and since time is money, these businesses now hope to analyze this huge supply of incoming data in real time.
Artificial intelligence, meanwhile, is the broad name for computer programming theories and techniques that simulate human intelligence and it’s a field that has seen rapid progress.
Most of the artificial intelligence we encounter on a daily basis is known as “weak AI,” meaning it is limited to a very narrow scope: games against a computer, predictive analysis or fraud detection, for example. But progress is continually being made, and though it’s still in its infancy, artificial intelligence could be the key to making our endless piles of big data useful.
What role can AI play in luxury? Businesses now use artificial intelligence to help solve several key issues ranging from inventory management to profitability. In retail, and more specifically in luxury, the big demand of AI is greater personalization: personalized communications, recommendations, services, and experiences. No, robots aren’t going to completely replace your charming shop advisers and clerks. But those customer representatives will soon be, if they aren’t already, heavily armed with AI support so that they can better understand and serve you.
Some brands have already created their own chatbots (AI programs which converse with customers via sound or text), and it’s likely you’ve already been chatted up by one. This messaging robot, which is available to answer customer questions instantly and at any time of day or night, also has the “intelligence” to suggest products to customers based on their archived tastes, share their shop addresses closest to the customer, and place orders for them. Burberry, Tommy Hilfiger, and Estée Lauder are just some of the luxury brands that have already adopted this type of AI technology.
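The brands’ actual bots are proprietary, but the behaviour described, instant answers plus taste-based suggestions, can be illustrated with a minimal rule-based sketch. All product names, categories and preference data below are invented for illustration; real systems typically layer natural-language understanding on top of this kind of logic:

```python
# Minimal rule-based chatbot sketch: matches keywords in a message and
# personalises suggestions using a (hypothetical) archived taste profile.
CATALOG = {
    "scarves": ["silk scarf", "cashmere scarf"],
    "bags": ["leather tote", "mini crossbody"],
}

def reply(message: str, taste_profile: list[str]) -> str:
    text = message.lower()
    if "hours" in text:
        return "Our boutiques are open 10am-7pm daily."
    for category, items in CATALOG.items():
        if category in text:
            # Rank items that match the customer's archived tastes first.
            ranked = sorted(items, key=lambda i: not any(t in i for t in taste_profile))
            return f"You might like our {ranked[0]}."
    return "Let me connect you with a stylist."

print(reply("Do you have any bags?", ["leather"]))  # You might like our leather tote.
print(reply("What are your hours?", []))            # Our boutiques are open 10am-7pm daily.
```

The point of the sketch is the shape of the system: always-on keyword matching for routine questions, with the customer’s stored preferences steering which product gets suggested first.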
But chatbots aren’t the only ways companies can use AI to help facilitate their online retail. Here are two more novel ways companies have applied AI to online retail and luxury industries:
Tech giant Alibaba has changed the online shopping experience of its millions of customers through the use of AI. Thanks to artificial intelligence algorithms, Alibaba’s system learns to continuously recognize the behaviors and purchasing desires of its users. Through this, the e-commerce platform can give customers recommendations based not only on their purchase history, but also on all of their activity on the platform – including searches, comments, and saved articles. Similarly, promotions are tailored to each individual.
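Alibaba’s actual algorithms are not public, but the idea of scoring products against a user’s whole activity stream (purchases, searches, saved items) rather than purchase history alone can be sketched as a simple weighted-signal recommender. The signal weights, tags and catalog below are all invented for illustration:

```python
from collections import Counter

# Hypothetical weights: a purchase signals stronger interest than a search.
WEIGHTS = {"purchase": 3.0, "save": 2.0, "search": 1.0}

def recommend(activity, catalog, top_n=2):
    """Score each catalog item by how often its tags appear in the
    user's weighted activity stream, then return the top items."""
    interest = Counter()
    for signal, tag in activity:
        interest[tag] += WEIGHTS.get(signal, 0.5)  # unknown signals get a small weight
    scored = {item: sum(interest[t] for t in tags) for item, tags in catalog.items()}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

activity = [("purchase", "watches"), ("search", "watches"), ("save", "sneakers")]
catalog = {
    "chronograph": ["watches"],
    "running shoe": ["sneakers"],
    "silk tie": ["formal"],
}
print(recommend(activity, catalog))  # ['chronograph', 'running shoe']
```

Even this toy version shows why the approach personalises beyond purchases: the saved-item signal is enough to rank the running shoe above the silk tie, which the user has never interacted with.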
Another example, Stitch Fix, is a popular and highly personalized online styling service. Every month, customers receive a box filled with clothes and accessories that Stitch Fix selected for them according to their tastes. Artificial intelligence helps pre-select the contents of these boxes according to company data that relates to these clients. This data can include the preferences of customers with similar tastes, a history of client returns and purchases, and, of course, detailed information about the company’s own products. But Stitch Fix also smartly offers human stylists who only minimally intervene to add a touch of customer service.
Thorny Questions Around Artificial Intelligence
Artificial intelligence is quickly becoming commonplace in our daily lives, but its prevalence brings up a new set of questions and concerns from online buyers and sellers.
Personal data protection, for instance, is clearly an important issue. Because artificial intelligence runs on data processing, the laws that regulate data use have become more explicit and numerous. In China, the Cybersecurity Law (2016) and the National Standard (2018) are the main laws concerning personal data protection. In Europe, the new GDPR (2018) regulates the processing of personal data, giving customers the right to portability and the right to be forgotten.
But do these new laws put a damper on AI innovation? Or, conversely, are they even strong enough, considering how data as seemingly harmless as a person’s electricity consumption can become a window into their daily habits?
In addition to these concerns, the issue of digital sovereignty is also now being raised by many. Today, most of the services and applications that collect our data are American-made (the U.S. dominates the field of R&D on artificial intelligence). But with the growth of online giants like Baidu, Alibaba, Tencent, and Xiaomi, China is becoming a serious threat to the U.S. reign over AI. So when China officially adopted a data localization law last November (purportedly to help fight against cyber-attacks), many saw the move as an action to keep local AI innovation and growth within its borders.
AI Puzzles Retailers Need to Solve
And there are plenty of questions for companies that want to start using AI: How do you differentiate and surprise customers who are already inundated with personalized digital marketing? How can you hyper-customize your offer without creating a sense of intrusion for the consumer? What is the right balance between human and artificial intelligence for customer relationships?
While some see AI as a revolution that will transform our lives across all fields, others see it as a blossoming danger to humanity. Whatever does happen with AI in the future, it’s promising to hear that both companies and their customers are asking tough questions about how and why AI is being used online.
Mazarine Asia Pacific is a digital communication agency specializing in luxury and culture. The agency implements artificial intelligence technology to gather data on real-time market trends.