This document discusses the importance of location data and geocoding in the insurance industry. It notes that pricing insurance policies depends heavily on assessing the risks associated with a property's location. It then outlines three key areas to improve geocoding accuracy: input addresses must be clean and structured, the geocoding engine needs to be optimized, and the reference database G-NAF requires better completeness and timeliness in adding new addresses. The overall goal is to achieve a geocoding match rate of over 95% to properly assess location-based risks.
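To make the first lever (clean, structured input addresses) concrete, here is a minimal, hypothetical Python sketch of light address normalization and a match-rate check against a reference set. The function names and the tiny abbreviation map are illustrative only; a production matcher against a reference database like G-NAF would involve full address parsing and fuzzy matching.

```python
import re

# Illustrative street-type expansions; a real matcher would use a full dictionary.
ABBREVIATIONS = {"ST": "STREET", "RD": "ROAD", "AVE": "AVENUE"}

def normalize_address(raw):
    """Light cleaning: uppercase, strip punctuation, collapse whitespace,
    and expand a few common street-type abbreviations."""
    tokens = re.sub(r"[^\w\s]", " ", raw.upper()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def match_rate(addresses, reference):
    """Share of input addresses that match the reference set after cleaning —
    the kind of metric you would track against a >95% target."""
    ref = {normalize_address(a) for a in reference}
    hits = sum(normalize_address(a) in ref for a in addresses)
    return hits / len(addresses)
```

For example, `normalize_address("12 main st.")` yields `"12 MAIN STREET"`, so an input file with two addresses, one of which matches the reference set after cleaning, reports a match rate of 0.5 before any engine or reference-data improvements.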
How are the Age of Disruption and geospatial tech & data key to the digital transformation of insurers? How will they allow insurers to rapidly change their business models, their cultures, and the way they use information to enhance the customer experience and remain relevant in the 21st century?
This document discusses how organizations can respond to digital disruption. It notes that rapid technological changes are enabling new business models and customer experiences. To avoid being disrupted, companies must adopt modern work practices like DevOps, automation, collaboration, iterative delivery, and cloud-first strategies. They should also cultivate an innovative culture with traits like empowerment, experimentation, diversity, and questioning assumptions. The document provides examples of disruption in various industries like insurance, transportation, and geospatial services to illustrate these points.
The document summarizes the results of a study of 1,000 US consumers regarding their expectations and needs for 5G connectivity. Some key findings include:
- Connectivity both inside and outside the home is now seen as very important, especially with increased home working due to the pandemic. However, consumers find current network quality inconsistent.
- Most working from home due to the pandemic want to continue doing so and see benefits in flexibility and work-life balance. Others also see benefits if given the option.
- Consumers recognize unequal access to reliable internet based on location and income. They want policies to address this.
- While understanding of 5G is still developing, most expect benefits in areas like value, convenience
This document summarizes Charles Reed Anderson's keynote presentation at the Internet of Things World Asia conference on September 19, 2018. The presentation evaluates the performance of IoT from 2014 to 2018, identifying what technologies and approaches are working ("hot"), not working ("not"), and what may come next ("next"). Some highlights included strong growth of IoT in China, Taiwan's approach to smart cities, lack of end user involvement in IoT projects, and how technologies like eSIMs and small, low-cost sensors may drive future adoption. The presentation concludes that while expectations of IoT may have been too high, opportunities still exist when collaborating across industries and tracking real-world use cases.
The document summarizes the results of a study of 2000 European consumers about their expectations and needs regarding internet connectivity and 5G networks. Some key findings include:
- Reliable internet connectivity has become essential for both work and personal life due to the pandemic. However, consumers find current network quality often falls short, especially during peak usage times.
- Many consumers now work from home and want to continue doing so to balance work and personal life, but better internet connectivity is needed to work effectively remotely.
- Consumers recognize that not all have equal access to reliable internet due to factors like location and income.
- While understanding of 5G capabilities is still developing, nearly half of consumers expect it to deliver
Mobile technology is changing how people interact with each other and brands. Mobile adoption is growing faster than any other technology in history. Near Field Communication (NFC) and mobile payments are expected to see significant growth in the next few years, with NFC-enabled users spending over $48 billion using mobile payments by 2017. New mobile advertising formats and the rise of wearable technology will also transform the mobile landscape in 2014 and beyond.
The State of Smart TV: Automatic Content Recognition - TV[R]EV
TV Data For Our Times: Back when America watched TV via rabbit ears or a 25-channel set top box, a panel-based measurement system based on viewers keeping diaries seemed like a perfectly reasonable way to measure what people were watching on TV.
But it’s 2018, and few people watch TV the old-fashioned way anymore. They’re increasingly watching TV via apps on digital devices and smart TVs. And even if those apps are owned by traditional networks and MVPDs, the programming that’s on them is not getting counted.
That’s why there’s so much excitement about the data collected via ACR (Automatic Content Recognition) off of smart TVs.
DOWNLOAD THE REPORT HERE: https://mailchi.mp/tvrevolution/tvrev-white-papers
Send any comments/questions to yotvrev@gmail.com
A New Source Of Data For TV Advertisers: The increasing adoption of smart television sets brings new opportunity for TV advertisers.
By 2021, eMarketer projects 114.3 million smart TVs will be in American homes. And with all of these new sets comes a wave of data that will help advertisers create more actionable, targeted campaigns designed to take advantage of the new way Americans watch TV.
This newly popular method of capturing viewing data uses Automatic Content Recognition (ACR) technology to track what users are watching on smart TVs, providing second-by-second reporting that can be used to understand viewing patterns and to measure the effectiveness of TV advertising.
In this report we will break down the emerging smart TV advertising ecosystem, examining how ACR provides a novel way to capture attention data and viewing habits. We’ll look at the major players in ACR advertising and the privacy and transparency concerns that advertisers need to be aware of when utilizing this new kind of data.
DOWNLOAD THE REPORT HERE: https://mailchi.mp/tvrevolution/advancedad
The Future of Telecom (Petro Chernyshov Business Stream) - IT Arena
Lviv IT Arena is a conference designed for programmers, designers, developers, top managers, investors, entrepreneurs, and startup founders. It takes place annually at the beginning of October at the Arena Lviv stadium in Lviv. In 2016 the conference gathered more than 1,800 participants and over 100 speakers from companies such as Microsoft, Philips, Twitter, UBER, and IBM. More details about the conference at itarena.lviv.ua.
Top Digital Strategic Predictions for 2017 and Beyond - Duy, Vo Hoang
Digital business innovation is creating widespread disruption with both direct and secondary effects. By 2020, 100 million consumers will shop using augmented reality, interacting in more virtual ways. Business models will be transformed by mundane technologies like search and blockchain. Secondary effects of digital changes, like changes to transportation from self-driving cars, will often prove more disruptive than initial changes. Digital strategists must consider both direct and indirect consequences of new technologies.
LUMA's Upfront Summit Keynote: "The Future of TV" - LUMA Partners
LUMA Partners presents “The Future of TV,” as presented at the Upfront Summit conference on February 4, 2016. This presentation reviews some of the key topics discussed at the conference: the rise of digital video, the traditional TV model, and convergent video.
Andrew Milroy - Top Security Trends and Takeaways for 2013 - Graeme Wood
The document discusses key security trends for 2013, including the convergence of cloud and mobile technologies, the evolution of the IT security market, and threats arising from social media and consumerization. It notes growing security issues from critical infrastructure becoming internet-connected and the impact of virtual threats spilling into the physical world. The security market is forecast to significantly grow in areas like mobile, virtualization, and DLP. The document concludes that cyberespionage and cyberattacks are becoming more strategic threats as IT becomes embedded in many industries.
This document discusses Innovattia, a mobile technology company that provides end-to-end software solutions including app development, mobile marketing, and payment solutions. It highlights that consumers are spending more time on mobile devices and smart TVs, so companies need to implement multi-device apps. The document promotes Innovattia's services for creating new marketing channels and monetizing brands across different devices and media.
This document summarizes the findings of an insurance awareness study. It collected data on consumers' top-of-mind awareness of auto insurance providers from surveys and compared it to providers' advertising spending and market share data. A marketing mix model was developed that found display advertising to be 9 times more effective than TV at generating awareness. The study showed State Farm was generating higher awareness than expected given its advertising spending, likely due to higher spending on digital display ads than competitors. The document provides recommendations on where insurance providers should focus advertising spending and ways to improve ad creative performance.
Analysis of recent transactions in the System Software industry, detailing transaction multiples (revenue & EBITDA), a multiples chart, active buyers, and transaction data. A goldmine of resources for entrepreneurs.
Major events like hurricanes Andrew and Katrina and 9/11 taught insurers to use geographic information systems (GIS) to better understand risks. GIS allows insurers to analyze exposure down to the street level and better model catastrophe loss scenarios. It provides underwriters important information about risks and helps claims departments respond faster to events. While initially used mostly by reinsurers, more insurers now see GIS as strategic for managing catastrophe exposure.
Navigating the Maze of Google Bid Adjustment v2 - Wing Yee Lee
The document discusses various methods for adjusting Google AdWords bids based on different factors like device type, location, time of day, and more. It explains how to calculate bid adjustments between desktop/tablet and mobile using either RPC-based or ROI-based methods. Additionally, it provides tips for adjusting bids based on location, day of week, time of day, and warns about making bids too convoluted with multiple adjustments. The overall goal is to help navigate setting optimal bids on Google ads.
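The RPC-based device method described above reduces to scaling the mobile modifier by revenue per click relative to desktop/tablet. A hypothetical sketch (the function name and sample figures are illustrative, not from the deck):

```python
def mobile_bid_adjustment(desktop_rpc, mobile_rpc):
    """RPC-based method: set the mobile bid modifier in proportion to
    mobile revenue per click relative to desktop/tablet RPC.
    Returns the adjustment as a percentage; Google Ads accepts
    device adjustments in the -90% to +900% range."""
    return (mobile_rpc / desktop_rpc - 1.0) * 100.0
```

For example, if desktop/tablet clicks earn $2.00 each but mobile clicks earn $1.50, the sketch returns -25.0, i.e. bid 25% less on mobile. An ROI-based variant would substitute profit-per-click figures for RPC in the same ratio.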
Construction industry leaders and business development experts came together on Jan. 10, 2019, to present a special forum and outlook session for business owners in the construction and real estate arena. This slideshow presentation, presented by Rea & Associates and Overmyer Hall Associates, is geared toward helping business owners and industry leaders gain valuable insight into the industry for the year ahead. Doug Houser, director of construction and real estate services with Rea & Associates, presented "Using Your Advisory Relationships To Add Value & De-Risk: A Future Outlook." He was followed by Joseph Urquhart, vice president, and Jack Kehl, surety manager, of Overmyer Hall Associates, who presented "2019 Insurance, Risk Management, and Surety Outlook."
To learn more about this presentation, or to inquire about upcoming presentations, contact us at https://www.reacpa.com/contact-us.
This document discusses past, present, and emerging paradigms in risk modeling and pricing for lending. It covers:
1) Generic credit bureau scores have long been used for lending decisions but custom models using lenders' own data can better rank risk. Custom scores are now widely used, incorporating more contract data.
2) An emerging paradigm fixes a target probability of default and solves for loan terms (amount, APR, term) that meet that risk level rather than basing pricing on credit score alone. This allows matching customers to any vehicle, like a peer-to-peer lending platform for autos.
3) The "Big Data" revolution enables capturing more applicant information to further refine risk assessments and
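The target-PD paradigm in point 2 can be sketched with a toy model: fix the acceptable probability of default, then solve for the largest loan amount that stays within it (APR or term could be solved the same way). Every coefficient below is made up for illustration; this is not a real scorecard.

```python
import math

def pd_estimate(score, apr, term_months, amount,
                b0=-2.0, b_score=-0.004, b_apr=0.05,
                b_term=0.01, b_amount=0.00001):
    """Toy logistic probability-of-default model.
    Coefficients are illustrative only: PD falls as credit score rises
    and grows with APR, term, and loan amount."""
    z = (b0 + b_score * score + b_apr * apr
         + b_term * term_months + b_amount * amount)
    return 1.0 / (1.0 + math.exp(-z))

def max_amount_for_target_pd(score, apr, term_months, target_pd,
                             lo=0.0, hi=100_000.0, iters=60):
    """Bisection search for the largest loan amount whose modeled PD
    stays at or below the target (PD is monotone in amount here)."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if pd_estimate(score, apr, term_months, mid) <= target_pd:
            lo = mid
        else:
            hi = mid
    return lo
```

Under these toy coefficients, a 700-score applicant at 6% APR over 60 months can be offered whatever amount keeps the modeled PD at, say, 5%; repeating the search across vehicles is what lets the platform match any customer to any car at a constant risk level.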
Presented at International Telecoms Week, 24th June 2019, Atlanta. New forecasts from Mobilesquared show RCS growth will accelerate to add 1.15 billion users and achieve scale by end 2020, with hockey stick P2A revenue growth predicted for 2022
Simplifying Risk Management with Accurate Property and Risk Data - Precisely
To determine the likelihood of a loss occurring and the potential impact of that loss, insurers must evaluate and analyze the potential risk an individual or organization may face.
It can be complex to source risk-specific property information to underwrite and price property insurance, and even more complex to manage exposure at the portfolio level or keep up with changes presented by weather, growing communities in urban areas, and population shifts.
During this webinar, you will learn more about:
- Harnessing hyper-accurate location and property data to accurately price and underwrite risk
- Risk data products that every insurer should focus on, from fire to crime
- Enabling dynamic and near real-time weather data for efficiency
- Creating data linkages to understand area risk posed by co-tenancy and adjacency
Leveraging Location Intelligence Data for Efficient Claims and Underwriting O... - Precisely
Quick and efficient claims response is the hallmark of creating lifetime customer value. But what if you could improve claims processing and reduce costs at the same time? Getting from FNOL to a closed file and a happy claimant requires moving beyond manual processes, interventions and visual reviews.
The top P&C carriers are showing us how to leverage and integrate a host of new and different data sources, such as internal systems, external sources and streaming information, to create simple and intuitive visualization that delivers insight and context.
Using location data such as dynamic weather and aerial imagery, this visualization accelerates data-driven insights through the democratization of data and analytics throughout an organization. Smart Insurers are using this as a win-win: combining this data with business data to enhance existing claims and underwriting processes, and creating long-term, positive impact on the entire claims experience.
Watch this on-demand webinar to learn how location data will:
- Improve claim assignment efficiency
- Manage claim adjuster workloads
- Manage drivetimes and prioritize claim resource allocations in real time
- RingCentral (RNG) is a provider of cloud-based communications solutions for businesses, including voice, video, and text messaging.
- The analyst recommends buying RNG stock, even though recent economic indicators point to a difficult investment environment due to market volatility.
- RNG's cloud-based model reduces costs for customers compared to traditional on-site software, and makes it easier for businesses to support mobile employees across locations.
Data Con LA 2019 - Pitney Bowes methodologies to Organize, Enrich and Analyze... - Data Con LA
Location brings an opportunity to gain a competitive edge by unlocking new insights that others have not considered. Why have they not considered location in the past? Because location is messy! Lat/longs, shapes on a map, lines on a map, routes, distance calculations… all of this data is often messy and difficult to bring into an analytics application. Learn how Pitney Bowes and Ironside applied sophisticated methodologies to Organize, Enrich and Analyze data from a location perspective.
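As one small example of the distance calculations mentioned above, great-circle distance between two lat/long pairs is commonly computed with the haversine formula. This generic sketch is illustrative, not Pitney Bowes code:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in kilometres,
    using the haversine formula on a spherical Earth model."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```

Even this simple case hides messiness: coordinate order (lat/long vs. long/lat), datum differences, and the spherical approximation all introduce error before the data ever reaches an analytics application.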
How Transparency Will Shape Location Marketing for the Future - MediaPost
Total ad-fraud losses amounted to $42 billion in 2019 alone. With so many wasted dollars and new privacy policies being implemented, brands and marketers are rightfully starting to demand more transparency from their location-based advertising data. Independent, third-party companies are improving business outcomes of location-targeted campaigns with trackable, real-time analytics for fraud prevention. In this presentation, Jason Smith outlines the importance of transparency in the location data marketplace and how transparency tools can be utilized to improve marketing in the long term.
Why the opportunity for RCS is now: RCS Business Messaging market forecasts t... - mobilesquared Ltd
Mobilesquared reveals latest RCS business messaging market forecasts, at Mobile World Congress Barcelona, 2019. Includes forecasts to 2023 for consumer engagement frequency by month, business messaging revenues, top 10 mobile operators by business messaging revenues, impact of WhatsApp on mobile operators and more.
Digiday Programmatic Media Summit. Glen Straub. Factual - Digiday
This document summarizes key findings from a survey of 369 publishers on navigating mobile with location data and measurement. It finds that while publishers are allocating more resources to mobile and measuring success through engagement, engagement does not correlate with direct sales. Publishers have an opportunity to leverage location data to better measure conversions. The document also examines challenges publishers face in scaling first-party data collection and their use of third-party data partners to overcome these challenges.
They're Not Wrong, They're Just Not Right - Zeta Global
This document discusses the limitations of traditional attribution approaches and promotes a deterministic approach using personal identifiers. It notes traditional multi-touch attribution (MTA) only measures growth responses, not incrementality. The document advocates for an approach using test and holdout groups to measure incrementality across online and offline channels. It describes Zeta's methodology involving testing channels sequentially and measuring their individual impacts to determine optimal media mix. Finally, it presents Zeta's identity graph and data assets used to power deterministic, omnichannel attribution.
Digiday Programmatic Media Summit. Glen Straub. FactualDigiday
This document summarizes key findings from a survey of 369 publishers on navigating mobile with location data and measurement. It finds that while publishers are allocating more resources to mobile and measuring success through engagement, engagement does not correlate with direct sales. Publishers have an opportunity to leverage location data to better measure conversions. The document also examines challenges publishers face in scaling first-party data collection and their use of third-party data partners to overcome these challenges.
They're Not Wrong, They're Just Not RightZeta Global
This document discusses the limitations of traditional attribution approaches and promotes a deterministic approach using personal identifiers. It notes traditional multi-touch attribution (MTA) only measures growth responses, not incrementality. The document advocates for an approach using test and holdout groups to measure incrementality across online and offline channels. It describes Zeta's methodology involving testing channels sequentially and measuring their individual impacts to determine optimal media mix. Finally, it presents Zeta's identity graph and data assets used to power deterministic, omnichannel attribution.
This document provides an overview of Winnebago Industries' leadership team and business segments. It lists the names and titles of the corporate leadership team members. It then provides a high-level forward-looking statement disclaimer and outlines the company's strategic priorities, including strengthening its core RV business and expanding into new profitable markets.
Kidnap and ransom insurance at an inflection pointCognizant
By gathering and distilling meaning from the metadata that exists in the digital world, insurance carriers can mitigate risk for companies that have globe-traveling executives.
This presentation provides an overview of a technology-powered real estate company to investors. It highlights the company's unique employee agent model and map-based search technology. The company has experienced strong revenue growth in recent years across its real estate services, rentals, mortgage, and other segments. It is also gaining market share in the US housing market. The presentation outlines the company's end-to-end product offerings that provide a one-stop shop for customers and its focus on using technology to improve productivity and provide a better customer experience than competitors.
17. to predict claims and claims cost, we need to determine the risk of something happening
to do that, we start with each property’s location and then assess its risk
27. calculating location-based risk
geocoding: L O C A T I O N (G-NAF)
spatial relationships: What’s nearby?
modelling: What could happen?
claim frequency: How often?
28. calculating location-based risk
geocoding: L O C A T I O N (G-NAF)
spatial relationships: What’s nearby?
modelling: What could happen?
claim frequency: How often?
damage curves: How severe?
30. DI & geocoding
why? geocoding is the only cost-effective method of locating risks, at the property level, on a national scale
1:1 pricing: a foundation of our strategy
what do we use? geocoding
31. DI & geocoding
status? geocoding and geo-pricing rolled out nationally
100% overall geocoding rate
>95% household geocoding rate
34. geocoding: 3 areas for improvement
1. input addresses
2. address matching
3. reference addresses (G-NAF)
35. input addresses
need a good address structure, e.g. UNIT 2, LEVEL 3, 10-20 ALFRED STREET NORTH, NORTH SYDNEY NSW 2060
captured as: Street 2/3/10-20 Alfred St N, Suburb Nth Sydney NSW 2060
legacy addresses will need to be cleansed
fields: sub-dwelling type, sub-dwelling no, level type, level no, street number, street name, street type, street suffix, locality, postcode, state
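The field breakdown on slide 35 can be sketched as a simple record type. The field names come straight from the slide; the class itself and the filled-in example are illustrative only:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StructuredAddress:
    """One field per address component, mirroring the slide's breakdown."""
    sub_dwelling_type: Optional[str] = None  # e.g. "UNIT"
    sub_dwelling_no: Optional[str] = None    # e.g. "2"
    level_type: Optional[str] = None         # e.g. "LEVEL"
    level_no: Optional[str] = None           # e.g. "3"
    street_number: Optional[str] = None      # e.g. "10-20"
    street_name: Optional[str] = None        # e.g. "ALFRED"
    street_type: Optional[str] = None        # e.g. "STREET"
    street_suffix: Optional[str] = None      # e.g. "NORTH"
    locality: Optional[str] = None           # e.g. "NORTH SYDNEY"
    postcode: Optional[str] = None           # e.g. "2060"
    state: Optional[str] = None              # e.g. "NSW"

# The slide's example address, fully structured:
addr = StructuredAddress(
    sub_dwelling_type="UNIT", sub_dwelling_no="2",
    level_type="LEVEL", level_no="3",
    street_number="10-20", street_name="ALFRED",
    street_type="STREET", street_suffix="NORTH",
    locality="NORTH SYDNEY", postcode="2060", state="NSW",
)
```

Storing each component separately, rather than a freeform line, is what makes later parsing and matching tractable.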
42. G-NAF (reference addresses): is it any good? yes it is
errors: yes, but limited
completeness: ~95% complete
timeliness: can take up to 12 months for new addresses to be added
43. G-NAF errors
mostly transient
range from amusing to business impact
can impact customers
use DIY database rules and QA to limit the impact
44. G-NAF errors, examples:
units being 1 km from their building address
addresses assigned to the wrong duplicate locality
aliases and principals with different coordinates
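A DIY database rule of the kind slide 43 mentions could look like the sketch below, which flags the first error class (units far from their building). The record layouts and the 250 m threshold are assumptions for illustration, not actual rules:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_stray_units(units, buildings, max_m=250):
    """Flag unit geocodes implausibly far from their parent building.

    `units` maps unit id -> (building_id, lat, lon) and `buildings` maps
    building id -> (lat, lon); both layouts are made up for this sketch.
    """
    flagged = []
    for uid, (bid, lat, lon) in units.items():
        blat, blon = buildings[bid]
        if haversine_m(lat, lon, blat, blon) > max_m:
            flagged.append(uid)
    return flagged
```

Flagged records would then go to manual QA rather than straight into pricing.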
47. G-NAF timeliness
some new houses are insurable before their address is in G-NAF
that’s why G-NAF Live is encouraging
48. summary
location is fundamental to insurance
geocoding: 3 areas of improvement (to get >95%)
addresses: clean, structured, well captured
engine: tested, optimised, customised
G-NAF: postcodes, sub-dwellings, G-NAF Live
Editor’s Notes
Hi everyone, thank you very much for coming to this presentation. I’d like to start off with a bit of interactivity!
So, a show of hands please! Who works for an organisation that geocodes their address data?
Very good – now who’s got a property level geocoding rate greater than 95%?
(Excellent, not an easy achievement) OR (Who thinks that’s achievable?)
It’s definitely achievable; our property-level geocoding rate is 9n%
Today I’d like to share with you how it can be done, and also touch on how users, vendors, and data custodians, aka the geospatial industry, have the opportunity to enhance addressing & geocoding further.
I’d also like to share with you why geocoding is so important to my organisation.
I’d like to start today’s discussion by giving you some background to insurance & IAG’s Direct Insurance division
...before diving into the fundamentals of insurance pricing and location based risk - to give you some context to the core of today’s presentation
Which is primarily about geocoding and addressing...
...and about sharing our experiences implementing a large scale geocoding system using G-NAF.
What is insurance?
According to Warren Buffet on a recent Australian visit – it’s the business of manufacturing promises.
That in return for purchasing an insurance policy, an insurer will promise to help a customer recover financially in the case of a disaster or accident.
Albeit within the limits of what is covered under that policy.
The promises fulfilled by the Australian insurance industry last financial year equated to $19.7 billion in claims paid to smash repairers, builders, suppliers, and customers.
This was over 1% of GDP - not an insignificant amount.
The division I represent is Direct Insurance (DI), which looks after the NRMA, SGIC and SGIO insurance brands; we also have a joint venture with RACV.
To give you an idea of the scale of IAG’s operations - Last financial year, we insured over 16 million risks. i.e. over 16 million homes, cars, businesses, farms.
We sold almost $9bn worth of policies and insured over $1.5 trillion of personal and commercial property – roughly the same as Australia’s GDP.
Locally, DI contributed to roughly 50% of the group’s business.
Fundamental to our ability to insure millions of risks, and hundreds of billions of dollars of assets, is accurate pricing.
Let’s look at that more closely
There are many factors that go into pricing an insurance policy, such as:
Reinsurance costs (the insurance that protects insurers against catastrophic loss)
Competition within the industry
and Government fees and charges
But at the heart of a policy’s price is being able to predict how often a customer will need to make a claim and how much it will cost each time
To do that we need to determine the risk of something happening, but where do we start... We start with location.
Risk is fundamentally defined by location and it heavily influences the price of an insurance premium.
This is because it defines the risk each property faces at the household level: e.g. whether you live in proximity to a park; or near bushland; or if you live on a main road
It also defines the risk at the suburb level, like your local crime rate;
Or at the regional level - like your earthquake risk
So how do we determine these location based risks for a property?
We start with location, by geocoding an address to determine its location.
We then look at the spatial relationships between that location and the surrounding area. Is the property near a park? Is it near bushland? Is it on a main road? For this, we use a variety of datasets, such as NAVTEQ street and POI data.
That data is then fed into a statistical model, to confirm which spatial relationships explain the risk of an event occurring, such as a bushfire.
Using claim frequency data we can then determine how often an event might happen – every 5 years, every 10 years, every 50 years?
Now that we’ve determined the types of risks that exist and how often they might occur, we can then apply historical damage data to assess the percentage of damage that would occur to a particular type of house.
To be able to do this work, we need a reference address set that we can both geocode against and use for spatial analysis, on a national scale. That dataset is obviously G-NAF.
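The steps just described form a pipeline, which can be sketched as below. Every callable here is a hypothetical stand-in for illustration, not a real component of the system described:

```python
def location_risk_sketch(address, geocode, nearby_features, hazard_model,
                         claim_frequency, damage_curve):
    """Chain the five steps; each argument is a stand-in callable."""
    location = geocode(address)            # address -> (lat, lon)
    features = nearby_features(location)   # spatial relationships: what's nearby?
    hazards = hazard_model(features)       # statistical model: what could happen?
    freq = claim_frequency(hazards)        # claim frequency: how often?
    severity = damage_curve(hazards)       # damage curves: how severe?
    return freq * severity                 # crude expected-loss proxy
```

The point of the sketch is the ordering: each stage consumes the previous stage’s output, and everything downstream of the first step depends on the geocode being right.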
I hope that gives you some context as to the value of geocoding to the insurance industry
Let’s have a look at how we’ve implemented geocoding, and have a look at geocoding and addressing issues in detail
So why does Direct Insurance use geocoding? We use it because it’s the only cost effective method of locating all our customers across Australia, down to the household level.
It is a key part of our 1:1 pricing strategy – to be able to price each customer individually based on their localised risk factors, at the household level.
We’ve implemented Mastersoft’s Harmony Suite, with G-NAF, for address parsing and matching.
We’ve rolled out geocoding and individual customer pricing across several million policies, nationally
Overall we’ve achieved a 100% geocoding rate.
More importantly though, we’ve achieved a 9n% household level match rate
Geocoding at a basic level is simply a process for converting an address into a usable location.
The key to a good geocoding rate is straightforward enough - but it can be difficult or expensive to implement, depending on the volume or structure of your address data
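At that basic level, the matching step can be sketched as a cascade that tries the most precise match first. The dict-based indexes and match-level names here are invented for illustration, not how any particular engine works:

```python
def geocode(addr, property_index, street_index, locality_index):
    """Try the most precise match level first; report which level matched.

    Each index maps an address key to a (lat, lon) pair; the layouts
    are made up for this sketch.
    """
    pkey = (addr["street_number"], addr["street_name"], addr["locality"])
    if pkey in property_index:
        return property_index[pkey], "property"
    skey = (addr["street_name"], addr["locality"])
    if skey in street_index:
        return street_index[skey], "street"
    if addr["locality"] in locality_index:
        return locality_index[addr["locality"]], "locality"
    return None, "unmatched"
```

Recording the match level, not just the coordinate, is what lets you distinguish a 100% overall rate from a >95% household-level rate.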
So what prevents you from getting a good geocoding rate?
There are 3 distinct areas where your geocoding rate can be improved, and these are mostly common sense:
1 - The first point is the most obvious one – the quality of your own address data.
2 - The second is the flexibility of your geocoding system to interpret each input address in a multitude of ways to match it to a known address.
3 – Lastly is the quality of the reference addresses used by your geocoding engine. In other words the quality of G-NAF.
Looking at input address issues:
Probably the most common issue is poor address structure. Not having addresses stored in a consistent set of fields.
Another key issue is that you won’t be able to achieve a high geocoding rate without manually or at least semi-automatically cleaning up your addresses. If you’ve been gathering addresses over a long period of time - prior to thinking about using that data as location information - then you may well have a smorgasbord of poorly spelt or downright unintelligible addresses in your database.
And if you have hundreds of thousands of addresses then you will potentially need to employ a team, for well over a year, to cleanup the data – that’s if you want a high geocoding rate.
We have legacy addresses – we have millions of them. In fact we have more addresses on file than there are addresses in Australia! In the past we’ve insured P.O. Boxes!
Fortunately for us – a lot of great work was done before we started on geocoding 3 years ago, which meant we had an excellent set of well structured addresses to start with.
Cleaning up your existing addresses is one thing - how you capture your data will keep your data clean.
There are 3 things that can be implemented at the point of address capture - whether it be via your own web page or through your internal applications - to capture clean, well structured addresses:
1 - Make sure the data is captured using an appropriate set of structured input fields, with rules on those fields. Preferably using a set of fields based on a standard.
2 - Enforce street types, street suffixes and locality and state names using pre-defined pick lists, not freeform text fields with no rules
3 - Use a rapid address tool, such as Harmony, to auto-populate the street and locality information as the user is typing
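Point-of-capture rules like these can be sketched as a simple validator. The pick lists below are deliberately abridged and the field names are assumptions; a real system would use the full gazetted sets:

```python
# Abridged pick lists for illustration only.
STREET_TYPES = {"ST", "RD", "AVE", "PDE", "CL"}
STATES = {"NSW", "VIC", "QLD", "SA", "WA", "TAS", "NT", "ACT"}

def validate_capture(fields):
    """Return a list of rule violations for a captured address."""
    errors = []
    if fields.get("street_type") not in STREET_TYPES:
        errors.append("street_type not in pick list")
    if fields.get("state") not in STATES:
        errors.append("state not in pick list")
    pc = fields.get("postcode") or ""
    if not (pc.isdigit() and len(pc) == 4):
        errors.append("postcode must be 4 digits")
    return errors
```

Rejecting freeform values at capture time is far cheaper than cleansing them years later.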
Localities...!
Local people will sometimes use the local or common name for their area, even though their gazetted locality name is completely different.
This causes a few problems:
Has anyone ever heard of a place called Glenquarie in SW Sydney? Neither has our geocoding engine! Nor G-NAF!
Tamworth is a rural city made up of several suburbs, but everyone says they live in Tamworth. What percentage of our customers in Tamworth do you think give us the wrong locality name? 95% ????
Vanity suburbs could be affecting around 10% of your addresses – they come in 3 main flavours:
1 – The real estate agent told me I live here, so I live here even though it’s the neighbouring suburb
2 – I want to live in the neighbouring affluent suburb so I’ll just use that name
3 – I’ll make one up based on local information
Your geocoding engine and G-NAF have some smarts to correct some of these issues, but not all of them. The solution is to create a lookup table of common and gazetted locality names
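Such a lookup table might be sketched as below. The Glenquarie entry is an illustrative placeholder, not a verified gazetted mapping:

```python
# Common / vanity locality names mapped to gazetted names.
# The entry below is an illustrative guess, not a verified mapping.
COMMON_TO_GAZETTED = {
    ("GLENQUARIE", "NSW"): "MACQUARIE FIELDS",
}

def normalise_locality(locality, state):
    """Swap a common name for its gazetted equivalent if we know one."""
    key = (locality.upper(), state.upper())
    return COMMON_TO_GAZETTED.get(key, locality.upper())
```

Applied before geocoding, this rescues addresses the engine would otherwise fail to match on locality.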
Looking at your geocoding engine and its potential limitations...
The key point is to identify the weaknesses in its address matching process, and to work around those issues where possible.
Firstly – don’t just accept the default configuration out of the box. Test the system, reconfigure it and test again.
If you really want to stress test the geocoding engine - input the entire raw G-NAF database and see what results you get. You won't get 100%, but you should get high 90s.
Secondly, if you find limitations in the system – add your own custom logic to it.
Lastly, the most obvious one: talk to your vendor: log the bugs and change requests if you want the system to perform better
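That stress test is a simple round trip: format each raw G-NAF record into an input address, run it through the engine, and measure the match rate. In this sketch, `geocode` is a stand-in for whatever API your engine actually exposes, returning a match or `None` on failure.

```python
# Sketch of the G-NAF round-trip stress test. `geocode` is a placeholder
# for your engine's API: it returns a match (e.g. coordinates) or None
# when the address fails to match.

def measure_match_rate(addresses, geocode):
    """Fraction of reference addresses the engine successfully matches."""
    matched = sum(1 for addr in addresses if geocode(addr) is not None)
    return matched / len(addresses)
```

Per the advice above, a result well below the high 90s on G-NAF's own addresses points at configuration or matching-logic issues rather than bad input.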
In closing – let’s look at some G-NAF issues related to geocoding rates
So how good is G-NAF as a reference address dataset?
Good enough to give us a greater than 95% property-level match rate, but it's not perfect... The PSMA are well aware of this and are looking into solutions.
There are some errors that creep into the data and there are some significant challenges to make it a more complete reference set of geocoded Australian addresses
On a positive note: resolving the issue of timely G-NAF updates is well and truly underway
Errors are a part of any large dataset with a reasonably complicated schema – and G-NAF is no different
In our experience – these errors are mostly transient things ranging from amusing to having a business impact. They usually aren’t a symptom of wider data quality issues.
The bad news is they can impact a customer. And in our case that could mean their premium goes up or down between policy renewals unexpectedly. So we tightly manage pricing impacts whenever we update G-NAF or re-geocode our customers. This problem can also occur between G-NAF versions when coordinates move significantly for valid reasons.
We haven't been actively logging bugs with the PSMA due to the competitive nature of insurance; that is now changing, so we owe the product team from PSMA a few emails. But I couldn't help listing a few of my favourites from the last 3 years:
Units whose base address was up to 1 km away
Addresses associated with the wrong duplicate locality (Hillgrove: Wagga vs Armidale) - 500 km away!
Alias and principal addresses with differing coordinates - that one is more serious and required our manual intervention to ensure customers weren't affected.
My recommendation, if geocoding is important: apply database schema rules to G-NAF and do your own QA to ensure any little things that have crept into the data don't impact your business operations.
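One QA check that follows directly from the alias/principal issue above: compare each alias's coordinates against its principal and flag pairs that diverge. The input schema (an id plus two lat/lon pairs) and the 100 m threshold are assumptions for the sketch; tune them to your own tolerance.

```python
import math

# QA sketch: flag alias addresses whose coordinates sit suspiciously far
# from their principal address. Input schema and threshold are assumptions.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_divergent_aliases(pairs, threshold_m=100.0):
    """pairs: iterable of (alias_id, (alias_lat, alias_lon), (principal_lat, principal_lon))."""
    return [alias_id for alias_id, alias_xy, principal_xy in pairs
            if haversine_m(*alias_xy, *principal_xy) > threshold_m]
```

Run this after each G-NAF load and route anything flagged to manual review before it can touch a customer's pricing.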
Aside from the obvious candidates of missing reference addresses and addresses without geocodes - there are 2 key issues regarding the completeness of G-NAF:
Postcodes are often viewed as a non-critical part of a structured address. That point of view ignores the fact that the address matching process works best with the maximum amount of information, and postcodes are a valuable piece of information that should be included in the process.
However, currently, postcodes only exist for localities with duplicate names within states. Having a postcode, where applicable, for all G-NAF localities has been on the cards for a little while – but we’d like to see them added as it would not only give a good boost to the geocoding rate but also a reduction in false positives.
Based on analysis of our customer addresses, about 8% of sub-dwellings are not in G-NAF with a geocode - these include townhouse developments, retirement villages, blocks of flats and permanent sites at caravan parks. This is the number one area where our results could improve.
Also, in large developments having property accurate coordinates, rather than a set of coordinates in the centre of the land parcel, is of great benefit – we’d love to see some more work done in this space as well.
Just because an address is retired in G-NAF doesn't mean you have to retire it. If you've got a good match to a retired address, why drop the geocode? Your data can be considered another valid source of good address information, so why not treat it as a 4th G-NAF data source?
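As a sketch of that 4th-source idea (with a hypothetical schema of address id mapped to coordinates): when a new G-NAF release retires an address you still hold a good match for, keep your last known geocode instead of dropping it.

```python
# Sketch of retaining retired addresses as an extra reference source.
# Schema is hypothetical: address_id -> (lat, lon).

def merge_reference(current_gnaf, previous_matches):
    """Combine the current G-NAF release with geocodes we already trust.

    Addresses retired from G-NAF but still matched to customers keep
    their last known coordinates rather than losing the geocode.
    """
    merged = dict(current_gnaf)
    for addr_id, coords in previous_matches.items():
        if addr_id not in merged:      # retired in the new release
            merged[addr_id] = coords   # keep our known-good geocode
    return merged
```

The design choice here is deliberate: active G-NAF records always win, and your own data only fills the gaps G-NAF has retired.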
Lastly, our take on the near future...
New addresses coming in via the Internet or through our branches and telephone consultants use a real-time geocoding service. The geocoding rate we get from this service drops over time, in between quarterly G-NAF updates. In other words, customers are building houses faster than we can get their reference address into the system.
This problem shouldn’t be around for too much longer – with the introduction of G-NAF Live in the near future we can potentially have a geocoding system that can be updated far more regularly than every quarter – and that will mostly eliminate this problem and allow us to maintain a very high geocoding rate far more easily.
In summary, some key points I’d like you to take away from today
Location information is fundamental to insurance pricing!
Focus on the 3 areas of improvement to lift your geocoding rates - 95% at the property level is achievable
We’re very excited by G-NAF Live