BLOG SPOTLIGHT
Navigating the chaos of data security in the age of GenAI—let’s break down what needs to happen next.
Sep 20
ALTR Welcomes Laura Malins as VP of Product
ALTR continues to strengthen its leadership team, and the latest addition brings a wealth of technical expertise and a fresh perspective to our growing company. We’re thrilled to welcome Laura Malins as the newest member of the ALTR family and VP of Product. With over a decade of experience in data, Laura’s extensive background across industries and technical roles makes her an invaluable asset as we continue to push the boundaries of data security and governance.
From Matillion to ALTR: A Proven Leader in Data Innovation
Laura joins us from Matillion, where she spent the past ten years shaping the future of data transformation. As VP of Product, she ran the Matillion ETL Product and spearheaded the launch of their revolutionary SaaS offering, Data Productivity Cloud. Her ability to understand deeply technical challenges and translate them into user-friendly solutions has earned her recognition as a product leader in the data space.
“I’ve worked with ALTR for a few years now and have always admired the company and the product. Data security platforms are becoming more pertinent than ever, and ALTR’s innovative product is well-positioned to support compliance and security requirements. I’m delighted to join such a strong and ambitious team, and I look forward to taking the product to the next level,” Laura shares.
Laura’s deep technical expertise and user-focused approach will be pivotal in pushing ALTR’s product suite to new heights. Her ability to bridge the gap between complex data challenges and practical, user-friendly solutions aligns seamlessly with our vision of delivering powerful, scalable data access control. With her proven leadership, we anticipate not just product evolution but transformation—bringing enhanced capabilities to our customers while staying ahead of the ever-evolving data security landscape. Laura’s leadership will help us continue empowering businesses to protect their most valuable assets while driving innovation forward.
Sep 19
Data Security for Generative AI: Where Do We Even Begin?
If you haven’t noticed the wave of Generative AI sweeping across the enterprise hardware and software world, it certainly would have hit you within 5 minutes of attending Big Data London, one of the UK’s leading data, analytics, and AI events. Having attended last year’s show, I can confidently say AI wasn’t nearly as dominant. But now? It’s everywhere, transforming not just this event but countless others. AI has officially taken over!
As a data-security-focused person, I find all this buzz both exciting and terrifying. Exciting because it feels like we’re on the verge of a seismic shift in technology—on par with the rise of the web or the cloud—driven by GenAI. And I get to witness it firsthand! But it is terrifying to see all the applications, solution consultants, database vendors and others selling happy GenAI stories to customers. I could scream into the loud buzz of the show floor, “We have seen this movie before! Don’t let the development of GenAI applications outpace the critical need for data security!” I’m thinking of the rush to web, the rush to mobile, the rush to cloud. All of those previous shifts suffered from the same thing: security is boring and we don’t want to do it. What definitely wasn’t boring was using a groundbreaking mobile app from 1800flowers.com to buy flowers—that was cool! Let’s have more of that! Who cares about security, right? That can wait…
Cybersecurity, and data security in particular, has had the task of keeping up with the excitement of new applications for decades. The ALTR engineering office is in beautiful Melbourne, FL, just a few hours from Disney. When I see a young mother or father with a concerned look racing after a small child who couldn’t care less that they’re about to run headlong into a popcorn stand, I think, “Application users are the kids, security people are the parents, and GenAI is whichever Disney character the kid can’t wait to hug.” It’s cute, but dangerous. This is exactly what is happening with GenAI and security.
As applications have evolved, so has data security. Below is an example of these application evolutions and how security adapted to cover the new weaknesses each one introduced.
What is Making Generative AI Hard to Secure?
The simple answer is: we don’t fully know. It’s not just that we’re still figuring out how to secure GenAI (spoiler: we haven’t cracked that yet); it’s that we don’t even fully understand how these Large Language Models (LLMs) and GenAI systems truly operate. Even the developers behind these models can’t entirely explain their inner workings. How do you secure something you can’t fully comprehend? The reality is—you can’t.
So, what do we know?
We know two things:
1. Each evolution of applications and data products has been secured by building upon the principles of the previous generation. What has been working well needs to be hardened and expanded.
2. LLMs present two new and very hard problems to solve: data ownership and data access.
Let’s dive into the second part first. To get access to the hardware currently required to train and run LLMs, we must use cloud or shared resources—things like ChatGPT or NVIDIA’s DGX Cloud. Until these models require less hardware or the hardware magically becomes more available, this truth will hold.
This is similar to the early days of the internet, when people wanted to send and receive sensitive information over shared internet lines. The internet was great for transmitting public or non-sensitive information, but how could banking and healthcare use public internet lines to send and receive sensitive information? Enter TLS. The same problem faces LLMs today.
How can a business (or even a person, for that matter) use a public, shared LLM/GenAI system without fear of data exposure? Well, it’s a very challenging problem—and not one that a traditional data security provider can solve. Luckily, there are really smart people working on this, like the folks at Protopia.ai.
So, data ownership is being addressed much like how TLS solved the private-information-on-public-internet-lines problem. And that’s a huge step forward. What about data access?
This one is a bit tougher. There are some schools of thought about prompt control and data classification within AI responses. But this feels a lot like CASB all over again, which didn’t exactly hit the mark for SaaS security. In my opinion, until these models can pinpoint exactly where their responses come from—essentially, identify the data sets they’ve learned from—and also understand who is asking the questions, we’ll continue to face risks. Only then can we prevent situations where an intern asks questions and gets answers that should only be accessible to the CEO.
Going back to the first item on our what-we-know list: we will need to build upon the solid data security foundations that got us here in the first place. It has become clear to me that for the next few years, Retrieval-Augmented Generation (RAG) will be how enterprises globally interact with LLMs and GenAI. While this is not a silver bullet, it’s the best shot businesses have to leverage the power of public models while keeping private information safe.
With the adoption of RAG techniques, the core data security pillars that have been bearing the load of a data lake or warehouse to date will need to be braced for extra load.
Data classification and discovery need to be cheap, fast, and accurate. Businesses must continuously verify that no information unsuitable for RAG workloads has slipped into the database from which retrieval occurs. This constant vigilance is crucial to maintaining secure and compliant operations. This is the first step.
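As an illustration of the kind of continuous check this step implies, here is a minimal, hypothetical pre-ingestion scan in Python. The labels and regex patterns are assumptions made for the sketch; a production classifier (ALTR’s or anyone else’s) relies on far richer signals than a handful of regexes.

```python
import re

# Illustrative patterns only -- real discovery tools use many more signals.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set[str]:
    """Return the sensitive-data labels detected in a document."""
    return {label for label, pat in PATTERNS.items() if pat.search(text)}

def safe_for_rag(doc: str) -> bool:
    """A document may enter the retrieval store only if nothing sensitive was found."""
    return not classify(doc)
```

Running a check like this on every document before it lands in the retrieval database is what turns classification from a one-time audit into the continuous vigilance described above.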
The next step is to layer on access control and data access monitoring so the business can easily set rules for which types of data each model and use case is allowed to consume. Just as service accounts for BI tools need access control, so too do service accounts used for RAG. On top of these access controls, near-real-time data access logging must be present. As RAG workloads access the data, these logs inform the business when access patterns change and allow it to easily comply with internal and external audits, proving that only approved data sets are used with public LLMs and GenAI models.
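To make the rules-plus-logging idea concrete, here is one possible sketch in Python. The account names, data tags, and policy table are all invented for illustration; in practice this enforcement lives in the database or a governance platform, not in application code.

```python
import json
import time

# Hypothetical policy table: which data tags each RAG service account may read.
POLICY = {
    "svc-rag-support-bot": {"public", "product-docs"},
    "svc-rag-exec-analytics": {"public", "product-docs", "financials"},
}

ACCESS_LOG = []  # in production this would stream to an audit store in near real time

def retrieve(account: str, doc_tags: set[str], doc_id: str) -> str:
    """Gate a retrieval request against policy and record the attempt either way."""
    allowed = POLICY.get(account, set()) >= doc_tags
    ACCESS_LOG.append(json.dumps({
        "ts": time.time(),
        "account": account,
        "doc": doc_id,
        "allowed": allowed,
    }))
    if not allowed:
        raise PermissionError(f"{account} may not read {doc_id}")
    return f"<contents of {doc_id}>"
```

Note that denied attempts are logged too: the audit trail, not just the gate, is what lets the business prove to auditors which data sets its models actually touched.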
The last step: keep the data secure at rest. The use of LLMs and GenAI will only accelerate the migration of sensitive data into the cloud. Data elements that were once protected on-prem will have to be protected in the cloud as well. But there is a catch: the scale requirements of this data protection will be a new challenge for businesses. You will not be able to point your existing on-prem encryption or tokenization solution at a cloud database like Snowflake and expect to get the full value of Snowflake.
When prospects or customers ask me, “What is ALTR’s solution for securing LLMs and GenAI?” I used to joke, “Nothing!” But now I’ve learned the right response: “The same thing we’ve always done to secure your data—just with even more precision and focus for today’s challenges.” The use of LLMs and GenAI is exciting and scary at the same time. One way to reduce the anxiety is to start with a solid foundation: understand what data you have, how that data is allowed to be used, and whether you can prove the data is safe at rest and in motion.
This does not mean you cannot use ChatGPT. It just means you must realize that you were once that careless child running with arms wide open to Mickey, but now you are the concerned parent. Your teams and company will be eager to dive headfirst into GenAI, but it’s crucial that you can articulate why this journey is complex and how you plan to guide them there safely. It begins with mastering the fundamentals and gradually tackling the tough new challenges that come with this powerful technology.
Sep 9
ALTR Expands GTM Team with Powerhouse Hires to Lead the Charge in Data Security
ALTR isn’t just keeping pace with the evolving data security landscape—we’re setting the speed limit. As businesses scramble to safeguard their data, ALTR is not just another player in the game; we’re the go-to solution for bulletproof data access control and security. And today, we’re doubling down on that promise with three strategic hires to turbocharge our Go-To-Market (GTM) strategy.
Meet the Heavy Hitters
Christy Baldassarre
Christy Baldassarre joins us as our new Director of Marketing, bringing a formidable blend of strategic vision and execution prowess. With a track record of driving brand growth and market penetration, Christy excels at crafting compelling narratives that resonate with target audiences. She’s a master at turning complex concepts into clear, impactful messaging and knows how to leverage the latest digital marketing tactics to amplify ALTR’s voice.
"I am excited to be on such a great team and to be a part of taking ALTR to the next level. I chose ALTR because of its excellence in Cloud Security and Data Protection. This is a great opportunity to collaborate with such a visionary team and contribute to groundbreaking solutions that not only push boundaries but set new standards of how to keep everyone’s data safe." - Christy
Rick McBride
Rick McBride, our new Demand Gen Manager, brings a deep expertise in go-to-market strategy. With a strong foundation in business development, Rick has honed his skills in identifying opportunities and driving pipeline growth from the ground up. He’s not just about crafting campaigns; Rick knows how to connect with decision-makers and convert interest into action.
“A successful go-to-market strategy thrives on seamless collaboration across various teams, and our GTM group is poised to be the driving force behind it. We're set to champion the Snowflake ecosystem—engaging with customers, Snowflake’s Field Sales team, and partners alike—to fuel strategic growth. By leveraging Snowflake's powerful native capabilities in Security and Governance, we aim to deliver at the speed and scale that Snowflake users expect. We're thrilled to extend this value to every organization that prioritizes and trusts Snowflake for their data management needs!” - Rick
George Policastro
Next, we've got George Policastro as our newest Account Executive. George is a seasoned sales professional with a proven track record of closing complex deals and delivering results. His strengths lie in his ability to deeply understand client needs, build lasting relationships, and strategically navigate the sales process to drive success.
"I’m thrilled to join ALTR and tackle one of the biggest challenges organizations face today: securing their sensitive data while unlocking its full potential to drive business growth." - George
ALTR: Defining the Future of Data Access Control and Security
The world of data security and governance has evolved dramatically from the days of simple perimeter defenses. Now, we’re dealing with sophisticated, multi-layered security strategies that need to keep up with cybercriminals who are more aggressive and resourceful than ever. The core principles—knowing where your data is, who can access it, and ensuring its protection—haven’t changed. However, as data moves to the cloud, the challenge is achieving these goals at an unprecedented scale and speed.
That’s where ALTR excels. We’re not just providing solutions; we’re reimagining what data access control and security can be in a cloud-first world. By cutting through the complexities and inefficiencies of traditional methods, we deliver a streamlined, scalable approach that makes data security both simple and powerful. Our intuitive automated access controls, policy automation, and real-time data observability empower organizations to protect sensitive data at rest, in transit, and in use—effortlessly and at lightning speed. With ALTR, securing your data isn’t just more accessible; it’s smarter, faster, and designed for today’s dynamic cloud environments.
With our latest GTM team expansion, we’re fortifying our foundation to evolve into a cloud data security market leader who’s not just part of the conversation but is driving it.
Sep 3
Unleashing the Power of FPE: ALTR Key Sharing Meets Snowflake Data Sharing
In a world where data breaches and privacy threats are the norm, safeguarding sensitive information is no longer optional—it's critical. As regulations tighten and privacy concerns soar, our customers are demanding cutting-edge solutions that don't just secure their data but do so with finesse. Enter Format Preserving Encryption (FPE). When paired with ALTR's capability to seamlessly share encryption keys with trusted third parties via platforms like Snowflake's data sharing, FPE becomes a game-changer.
Understanding Format Preserving Encryption (FPE)
Format Preserving Encryption (FPE) is a type of encryption that ensures the encrypted data retains the same format as the original plaintext. For example, if a credit card number is encrypted using FPE, the resulting ciphertext will still appear as a string of digits of the same length. This characteristic makes FPE particularly useful in scenarios where maintaining data format is crucial, such as legacy systems, databases, or applications requiring data in a specific format.
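To show the shape of the idea (and only the shape), here is a toy Python sketch of a format-preserving cipher over digit strings, built as a small Feistel network with modular arithmetic. This is illustrative pseudo-FPE, not the NIST FF1/FF3-1 algorithms that production systems use, and it must never be used to protect real data.

```python
import hashlib
import hmac

def _prf(key: bytes, data: str, rnd: int, width: int) -> int:
    # Keyed pseudorandom function for one Feistel round.
    mac = hmac.new(key, f"{rnd}:{data}".encode(), hashlib.sha256).digest()
    return int.from_bytes(mac[:8], "big") % (10 ** width)

def fpe_encrypt(key: bytes, digits: str, rounds: int = 10) -> str:
    # Split the digit string into two halves and alternate modular additions
    # driven by the PRF; the length and digit alphabet are preserved.
    u = len(digits) // 2
    a, b = digits[:u], digits[u:]
    for i in range(rounds):
        if i % 2 == 0:
            a = str((int(a) + _prf(key, b, i, len(a))) % 10 ** len(a)).zfill(len(a))
        else:
            b = str((int(b) + _prf(key, a, i, len(b))) % 10 ** len(b)).zfill(len(b))
    return a + b

def fpe_decrypt(key: bytes, digits: str, rounds: int = 10) -> str:
    # Run the rounds in reverse, subtracting instead of adding.
    u = len(digits) // 2
    a, b = digits[:u], digits[u:]
    for i in reversed(range(rounds)):
        if i % 2 == 0:
            a = str((int(a) - _prf(key, b, i, len(a))) % 10 ** len(a)).zfill(len(a))
        else:
            b = str((int(b) - _prf(key, a, i, len(b))) % 10 ** len(b)).zfill(len(b))
    return a + b
```

Encrypting a 16-digit card number yields another 16-digit string, so legacy schemas, length checks, and column types keep working; only holders of the key can invert the transformation.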
Key Benefits of FPE
Seamless Integration
FPE maintains the data format, allowing easy integration into existing data pipelines without requiring significant changes. This minimizes the impact on business operations and reduces the costs associated with implementing encryption.
Compliance with Regulations
Many regulatory frameworks, such as GDPR, PCI-DSS, and HIPAA, mandate the protection of sensitive data. FPE helps organizations comply by encrypting data while preserving its usability and format, which some of these standards and the systems built around them require.
Enhanced Data Utility
Unlike traditional encryption methods, FPE allows encrypted data to be used in its existing form for specific operations, such as searches, sorting, and indexing. This ensures organizations can continue to derive value from their data without compromising security.
The Role of Snowflake in Data Sharing
Snowflake is a cloud-based data warehousing platform that allows organizations to store, process, and analyze large volumes of data. One of its differentiating features is data sharing, which enables companies to share live, governed data with other Snowflake accounts in a secure and controlled manner, while also shifting the compute costs of querying the data to the share's consumer.
Key Features of Snowflake Data Sharing
Real-Time Data Access
Snowflake's data sharing allows recipients to access shared data in real-time, ensuring they always have the most up-to-date information. This is particularly valuable in scenarios where timely access to data is critical, such as in financial services or healthcare.
Secure Data Exchange
Snowflake's platform is designed with security at its core. Data sharing is governed by robust access controls, ensuring only authorized parties can view or interact with the shared data. This is crucial for maintaining the confidentiality and integrity of sensitive information.
Scalability and Flexibility
Snowflake's architecture allows for easy scalability, enabling organizations to share large volumes of data with multiple parties without compromising performance. Additionally, the platform supports a wide range of data formats and types, making it suitable for diverse use cases.
The Power of Combining FPE with Snowflake’s Key Sharing
When FPE is combined with the ability to share encryption keys via Snowflake's data sharing, it unlocks a new level of security and flexibility for organizations. This combination addresses several critical challenges in data protection and sharing:
Controlled Access to Encrypted Data
By leveraging FPE, organizations can encrypt sensitive data while preserving its format. However, there are scenarios where this encrypted data needs to be shared with trusted third parties, such as partners, auditors, or service providers. Through Snowflake's data sharing and ALTR's FPE Key Sharing, companies can securely share encrypted data along with the corresponding encryption keys. This allows the third party to decrypt the data within the policies that they have defined and use it as needed.
Data Security Across Multiple Environments
In a multi-cloud or hybrid environment, data often needs to move between different systems or be shared with external entities. Traditional encryption methods can be cumbersome in such scenarios, as they require extensive reconfiguration or key management effort. With FPE and Snowflake's key sharing, however, organizations can seamlessly share encrypted data across different environments without compromising security. The encryption keys can be securely shared via Snowflake, ensuring only authorized parties can decrypt and access the data.
Regulatory Compliance and Auditing
Many regulations require organizations to demonstrate that they have implemented appropriate security measures to protect sensitive data. By using FPE, companies can encrypt data in a way that complies with these regulations. At the same time, the ability to share encryption keys through Snowflake ensures that data can be securely shared with auditors or regulators. Additionally, Snowflake's robust logging and auditing capabilities provide a detailed record of who accessed the data and when, which is essential for compliance reporting.
Enhanced Collaboration with Partners
In industries like finance, healthcare, and retail, collaboration with external partners is often essential. However, sharing sensitive data with these partners presents significant security risks. By combining FPE with ALTR's key sharing, organizations can securely share encrypted data with partners, ensuring that sensitive information remains protected throughout the data's lifecycle, including across shares. This enables more effective collaboration without compromising data security.
Efficient and Secure Data Processing
Specific data processing tasks, such as data analytics or AI model training, require access to large volumes of data. In scenarios where this data is sensitive, encryption is necessary. However, traditional encryption methods can hinder the efficiency of these tasks due to the need for decryption before processing. With FPE, the data can remain encrypted during processing, while ALTR's key sharing allows the consumer to decrypt data only when absolutely necessary. This ensures that data processing is both secure and efficient.
Use Cases of FPE with ALTR Key Sharing
To better understand the value of combining FPE with ALTR's key sharing, let's explore a few use cases:
Financial Services
In the financial sector, organizations handle a vast amount of sensitive data, including customer information, transaction details, and credit card numbers. FPE can encrypt this data while preserving its format, ensuring it can still be used in legacy systems and applications. Through Snowflake's data sharing, financial institutions can securely share encrypted transaction data with external auditors, partners, or regulators, along with the necessary encryption keys. This ensures compliance with regulations while maintaining the security of sensitive information.
Healthcare
Healthcare organizations often need to share patient data with external entities, such as insurance companies or research institutions. FPE can encrypt patient records, ensuring they remain secure while preserving the format required for healthcare applications. Snowflake's data sharing allows healthcare providers to securely share this encrypted data with third parties, while ALTR shares the corresponding encryption keys, allowing those parties to access and use the data while staying compliant with HIPAA and other regulations.
Retail
Retailers often need to share customer data with marketing partners, payment processors, or logistics providers. FPE can be used to encrypt customer information, such as names, addresses, and payment details while maintaining the format required for retail systems. Snowflake's data sharing enables retailers to securely share this encrypted data with their partners; with ALTR, the encryption keys are also shared, ensuring that customer information is always protected.
The Broader Implications for Businesses
The combination of Format Preserving Encryption and ALTR's key-sharing capabilities represents a significant advancement in the field of data security. This approach addresses several critical challenges in data protection and sharing by enabling organizations to securely share encrypted data with trusted third parties.
Strengthening Trust and Collaboration
In an increasingly interconnected world, businesses must collaborate with external partners and share data to remain competitive. However, this collaboration often comes with significant security risks. By leveraging FPE and ALTR's key sharing, organizations can strengthen trust with their partners by ensuring that sensitive data is always protected, even when shared. This leads to more effective and secure collaboration, ultimately driving business success.
Reducing the Risk of Data Breaches
Data breaches can devastate businesses, bringing financial losses, reputational damage, and regulatory penalties. Organizations can significantly reduce that risk by encrypting sensitive data with FPE and securely sharing it via Snowflake. Even if the data is intercepted, it remains protected, as only authorized parties with the corresponding encryption keys can decrypt it.
Enabling Innovation While Ensuring Security
As organizations continue to innovate and leverage new technologies, such as artificial intelligence and machine learning, the need for secure data sharing will only grow. The combination of FPE and ALTR's key sharing enables businesses to securely share and process data innovatively without compromising security. This ensures that organizations can continue to innovate while protecting their most valuable asset – their data.
Wrapping Up
Integrating Format Preserving Encryption with ALTR's key sharing capabilities offers a powerful solution for organizations seeking to protect sensitive data while enabling secure collaboration and innovation. By preserving the format of encrypted data and allowing for secure key sharing, this approach addresses critical challenges in data protection, regulatory compliance, and data sharing across multiple environments. As businesses navigate the complexities of the digital age, the value of this combined solution will only become more apparent, making it a vital component of any robust data security strategy.
ALTR's Format-preserving Encryption is now available on Snowflake Marketplace.
Sep 13
Crushing the Data Privacy and Security Problem
ALTR Blog
Way out over our skis
As a society, we’ve been barreling down the path to being completely “data-driven” for the last 10 to 20 years. We’ve produced and collected massive amounts of data and rapidly built an ecosystem of technological conveniences on that data foundation—without thinking about the new risks that brings. But looming over the horizon are data privacy and security challenges so large that if we don’t solve them, we could actually regress technologically. All the conveniences we take for granted require that those hoards of data stay safe and secure. If we don’t halt these hazards in their tracks, they’ll only get worse, until the only choice is to lock down data, putting all the progress we’ve made at risk. In other words, we’re way out over our skis right now.
At ALTR, we’re committed to doing whatever it takes to overcome this challenge. Our mission is to solve for the root problems of data privacy and security and give everyone the tools to crush them.
Uniquely positioned to democratize data governance
We don’t believe high costs, long implementations, and big resource requirements should stop companies from controlling and protecting their data. And unlike many other companies in the data control and protection space, we’re not a legacy technology built for the datacenter and awkwardly, bulkily, expensively transitioned to the cloud. From our beginning, we saw the potential and power of building on the cloud, and as a cloud-native SaaS provider, we don’t have the overhead expenses others do. This means we can utilize advanced cloud technologies to solve these difficult problems and provide solutions to customers on cloud data platforms easily, quickly and at a very low cost.
And that’s what we’re doing. Today, we’re announcing three new plans that deliver simple, complete control over data. This includes our Free plan—the first and only in the market—which gives companies powerful data control and protection for free, for life, starting on Snowflake. Just like Snowflake is democratizing cloud data access, ALTR is democratizing cloud data governance. We’re freeing data governance so that everyone can control and protect their data.
Powerful cloud-based data control and protection that matures with your Snowflake journey
One of the reasons companies struggle to tackle the data privacy and security challenge is that they leave understanding who is consuming what data and why until later. With other solutions, it’s too expensive or too resource-intensive, so instead of adopting data governance early, as a preventative action, companies wait to understand and manage data usage until it becomes required to comply with regulations, or worse, they have a privacy or security incident.
ALTR’s Free plan clears away those objections, supporting companies’ initial forays into Snowflake by enabling easy yet powerful data control and protection from the start, with no cost and no commitment. Our new data usage analytics capability shows who your top Snowflake users are and gives clear visibility into what and how much data they’re consuming. This allows you to better understand what normal is, create controls based on necessary usage, and quickly identify and investigate anomalies. At a high level, this intelligence will help you assess the value of your Snowflake project, plan your future roadmap, and put you in a better position to solve problems that might arise later.
The ALTR Free plan is available here on our website or now as the first complete data control and protection solution via Snowflake Partner Connect.
As you mature in your cloud data platform journey, better understand your use cases, and start thinking about adding sensitive data, you may want to upgrade to ALTR Enterprise for expanded access controls, compliance support and enterprise integrations. Once you’re more advanced and have migrated all your sensitive data, you may find yourself manually implementing governance features, which can make the cost of ownership and maintenance very high. ALTR Enterprise Plus can help you automate to save time and scale more easily, utilize powerful data security features like tokenization, and get the support of our experienced customer success team.
Ending the data privacy and security problem
Data governance tools that are too costly and require too much time and too many resources to implement and maintain are crippling our ability to take on the big data privacy and security problems we must solve in order to continue our pace of technological advancement. Leaving data at risk is just not sustainable for anyone. ALTR's release today is the next step in our journey to solve these problems and deliver formidable solutions that are easy to use and easy to buy – so everyone can control and protect their data, wherever they are on their data journey.
Get started for free today.
Dec 13
Are You Ready When the Cloud Turns into a Downpour?
I think it’s safe to say that everyone in the technology industry was shocked by last week’s Amazon Web Services outage. As one of the major backbones of cloud-based Internet services, Amazon’s issues affected everything from Disney+, Netflix and Roku streaming to services such as Venmo and CashApp to the company’s own delivery drivers. Although this was unprecedented for Amazon, it was definitely a wakeup call to anyone who relies on cloud hosting services. And it underscored the need for true SaaS-based, multi-region “high availability” solutions to support resiliency in modern data architectures.
Between a blip and a disaster
According to Amazon, the issue originated in its US-East-1 region in Virginia. Of course, Amazon’s services include high availability within its East, West and other global regions, so if there’s an issue in one datacenter or zone within a single region, workloads can be routed to different locations within the same region to ensure uptime. This incident was so shocking because the number of core services affected in a single region increased the blast radius so significantly.
Yet, it might not qualify as a true “disaster” as defined in most disaster recovery planning. Those are natural events (hurricanes, tornadoes, tsunamis) or man-made ones (accidents, sabotage, terrorism, hacking) that cause a long-term disruption to the business. In this case, there was no natural event or obvious intentional sabotage that would clearly impact availability long term. While companies could have implemented their disaster recovery plans, doing so would have come with costs: potential loss of a limited amount of data, employee resources absorbed in executing the plan, and the time needed to reverse the changes once the incident passed. There was no way to know how temporary or long-lasting the incident might be, so executing a disaster recovery plan could have placed an additional and unnecessary burden on the business.
That still leaves a gap between normal operations and full-on disaster recovery – a gap that can cause major issues. Many software companies today pride themselves on the “five nines” – 99.999% uptime in a given year. Some even include that SLA in their contracts. Even though this incident lasted less than a day, it was potentially enough to drop affected companies’ uptime for 2021 from five nines to three: 99.9%. Measured in allowed downtime, they lost two orders of magnitude in one incident.
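The arithmetic behind “the nines” is easy to check for yourself. A rough sketch (illustrative figures only, not from any vendor’s SLA):

```python
# Allowed downtime per year implied by a given uptime percentage.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(uptime_pct: float) -> float:
    """Minutes of downtime per year permitted at a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_pct / 100)

five_nines = downtime_minutes(99.999)  # ~5.3 minutes per year
three_nines = downtime_minutes(99.9)   # ~526 minutes (~8.8 hours) per year
# A single incident lasting most of a business day is enough to fall from
# five nines to three: roughly a 100x (two orders of magnitude) difference.
```

That is why a sub-day outage can blow an annual SLA that sounded nearly bulletproof on paper.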
In the critical path of data
For modern enterprises that rely on data to run their internal operations, uptime and availability are just as important, and in the modern data architecture, the entire system is only as reliable as its weakest link. This is especially critical right now for industries like banking, where the movement of money, and in fact the entire system, relies on the flow of data. But this will become increasingly important across all industries as data becomes more and more essential to core business functions.
That means when you’re picking your database, your ETL provider, and your BI tools, availability needs to be a key “non-functional” factor you evaluate as part of your buying process along with any features you need. When it comes to data control and security solutions, ALTR’s availability is unmatched. Because we interact with and follow data from on-prem through the ETL process to the cloud, we’re in the critical path of data, making it crucial that our service keep up with the rest of the data ecosystem. Our answer is multi-region, high availability built into normal operations via a true multi-tenant SaaS solution. Because SaaS allows economies of scale, we don’t have to build and maintain dedicated single-customer infrastructure in multiple geographic regions. We just have to construct ALTR infrastructure in multiple geographies for all our customers to leverage, significantly reducing cost and complexity.
Disruption doesn’t have to slow you down
This incident highlighted a weakness in resiliency planning: what happens when something less than a disaster causes a significant disruption to the business? We don’t think the uptime guarantees many have trusted to date are strong enough for today’s business environment, especially for business-critical data infrastructure. The future is high availability based on multi-region redundancy – we expect this to be a broad and consistent theme across industries. And for essential data control and security, that means ALTR.
Aug 18
0
min
Moving to the Cloud Doesn't Have to Be Daunting for Small and Mid-size Financial Institutions
ALTR Blog
Cloud computing has disrupted just about every single industry since its inception in the mid-2000s, and financial services are no exception. According to a 2020 Accenture report, banking is just behind industries like ecommerce/retail, high tech electronics, and pharma/life sciences on the cloud adoption maturity curve.
In fact, the average bank has 58 percent of its workloads in the cloud, but the majority run on private rather than public cloud, which limits the cost savings and benefits. The emphasis on private cloud may be due to the unique challenges the banking industry faces, including meeting stringent regulatory requirements on infrastructure it doesn’t own. And for banks that have made no move to the cloud, investments in legacy systems and the difficulty of transitioning may be an additional roadblock.
For smaller and midsize banks in the sub-$10B asset class, the decision to move to the cloud can be even trickier. They may feel pressure from their boards to adopt this powerful new technology while also being strongly reminded to do so safely! Regional banks often build their reputations on the trust of local communities, which makes any change that could damage that trust an enormous risk. At the same time, they may be competing with larger players that have adopted the cloud at a rapid pace to provide cutting-edge services to consumers. Capital One became the first major bank to go cloud-only when it closed its own datacenters entirely and moved all operations to the AWS public cloud. Accenture found that moving swiftly to the cloud is paying off for banking “cloud leaders”—they’re growing revenue twice as fast as the “laggards”.
So, it’s clear that the move to the cloud is coming for even smaller banks, but where to start? We suggest taking a look at your enterprise data warehouse.
Take the first step into the cloud with your enterprise data warehouse
For banks that need to minimize risk, are unsure how to make the move safely and properly, may not have a CISO or even a dedicated security team, and whose IT teams are focused on managing their own iron in their own datacenters, taking the first step to the cloud can seem like a heavy lift. It doesn’t have to be.
You can start small with your enterprise data warehouse. This is often a SQL Server instance in your own datacenter where you collect a daily or weekly data dump from your core systems. It may contain some sensitive data that’s accessed by various groups around the company via business intelligence tools like PowerBI, Qlik or Tableau installed on user desktops. Marketing, for example, might run zip code reports on deposits in order to make targeted offers on mortgages or car loans.
The key is that the data is already consolidated and it’s not core software – that reduces both the complexity and the risk. Moving this workload to Snowflake with ALTR data governance and security is fairly straightforward and provides several interesting advantages for small and mid-size financial institutions:
- Reduced costs: Eliminate the expense of maintaining the datacenter server infrastructure required for an on-prem data warehouse. With Snowflake, you pay for storage separately from compute, minimizing costs.
- Increased reliability and scalability with enterprise-level security: Snowflake’s unique architecture allows virtually all your users and data workloads to access a single copy of your data without impacting performance. In addition, Snowflake and ALTR offer SOC 2 Type 2, PCI DSS compliance, and support for HIPAA compliance.
- Expanded BI access: You can standardize on a SaaS business intelligence tool you don’t have to install or maintain. This allows you to provide easy access to more users across the company to make the best use of your data.
- Enriched data with third party datasets: Snowflake Data Marketplace supplies more than 500 live and ready-to-query data sets from more than 140 third-party data and data service providers to enhance your own datasets and provide deeper, actionable insights for your organization.
- Integrated cloud-native data governance and security: With ALTR, you can discover and classify sensitive data as it’s moved to the cloud then instantly and automatically restrict access based on those classifications. You also get complete observability and protection over how data is consumed regardless of access point.
Snowflake + ALTR = your secure cloud data warehouse-in-a-box
Migrating your enterprise data warehouse workload to Snowflake with ALTR essentially gives you a “secure cloud data warehouse-in-a-box”. You can take on this discrete pilot project, with minimal investment of time, resources, cost and risk, and learn how the cloud works best for your financial institution. This allows you to start operationalizing your process for moving additional workloads to the cloud safely.
In addition to the experience gained, there can be measurable results. Accenture estimates that when financial institutions move data warehousing and reporting workloads to the cloud they could see a 20-60% reduction in costs, increased operational efficiencies, improved real-time data availability, and better data governance and lineage.
Not bad for your first adventure into the cloud.
Ready to start your secure cloud data warehouse journey? Let us show you how easy it is to get started. Get a demo today.
Mar 16
0
min
Put your Cloud Data Migration on Rails
ALTR Blog
Everybody is doing it: the cloud data migration. Whether you call your project “data re-platform”, “data modernization”, “cloud data warehouse adoption,” “moving data to the cloud” or any of the other hot buzz phrases, the idea is the same: move data from multiple on-prem and SaaS-based systems and data storage into a centralized cloud data warehouse where you can use that data to spend less money or make more money. In other words, the goal of consolidating into a cloud data warehouse is, at its core, to save the company money on cost of goods sold or grow its revenue streams.
What’s not to love? But one of the tradeoffs is that, as you go through the process, you end up losing the full visibility and control you had over data in your on-prem systems, leading to cloud migration risks and cloud data migration challenges you might not expect.
We see this concern about visibility and control come up over and over, at every stage of the cloud data migration journey – from the CIOs, CISOs and CDOs who are accountable for making sure data stays secure, to the leaders who must address this in the overall project within the given budget, until it finally lands on the Data Engineers and DBAs who must decide how they’ll fix it and pick the actual solution. Here’s how to mitigate cloud migration risks...
Overcome Cloud Data Migration Challenges
Putting your cloud data migration project on rails out of the gate means you can move more quickly, more securely. Even if you’re going 100 MPH with data, you can be sure you’re not going to:
- Fly off the path
- Put your data or company at risk
- Damage your reputation
How do you do that? Whether you’re using an ETL tool or Snowpipe to transfer data, and whether you choose a cloud-hosted BI (Business Intelligence) tool or an on-prem solution to run analytics, doing it all with visibility, control, and protection in place from the very moment you begin will mean that your continuous data path from on-prem to the cloud is safe, speedy and secure.
And those pesky cloud data migration challenges that could pop up at each stage won’t slow you down or derail the project:
Selecting a Cloud Data Warehouse:
Whether it’s Snowflake, Amazon Redshift, or Google BigQuery, you’ll need to answer these questions to get your cloud data migration project off the ground…
- How do I control user access to the data?
- What level of audit logging do I have? And what do I need?
- How can I integrate this with my existing security stack (SSO, MFA, Splunk)?
- If data cannot be in clear text, is encryption okay or will I need a tokenization solution?
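On that last question: unlike encryption, tokenization replaces a sensitive value with a random stand-in that has no mathematical relationship to the original. A minimal vault-style sketch (illustrative only, not ALTR’s actual implementation):

```python
import secrets

# Vault-style tokenization sketch: sensitive values are swapped for random
# tokens, and the real values live only in a separate, access-controlled vault.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token  # safe to store downstream; reveals nothing about the value

    def detokenize(self, token: str) -> str:
        # In a real system this call would be restricted to authorized roles.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")        # the SSN never leaves the vault
assert vault.detokenize(token) == "123-45-6789"
```

Because the token carries no recoverable information, stealing the warehouse copy alone yields nothing, which is why some compliance regimes prefer tokenization over encryption for data at rest.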
Selecting a BI tool:
Again, whether you’re looking at Tableau, Qlik, PowerBI or another tool, similar operational questions will come up…
- Can I give different users using a shared service account varying levels of access?
- If I wanted to do tokenization, where in the stack is the best place to implement it? Should it be in the BI tool or elsewhere?
- How can I see what each user is doing through a single service account?
- How do I know what sensitive data is being accessed in the reports? How can I control that data?
Choosing an ETL:
When moving the actual data from data sources to the new cloud data warehouse, if these questions don’t come up, they should…
- When sensitive data is moving from on prem to the cloud, how can I protect it at rest?
- Where should I implement data security in the cloud migration journey?
- When I protect all my data, how can I keep it useful? How do I keep reports from breaking?
- When I move data into the cloud, how can I be sure it won’t be stolen?
Implementing a Data Catalog:
Data “brains”, like Collibra, OneTrust, and Alation, have all the information about users, the data itself, data classifications and the policies that should be placed on user data access. What they don’t have is a way to operationalize that policy.
- Manually approving and applying access is a time sink. Is there anything to automate this?
- I have no way of controlling the admins of these systems. Do they have to have this much access to my data?
- I can only apply masking policies manually with support tickets. Why?
If you can be prepared to answer these questions from the start, you can avoid cloud migration risks, overcome cloud data migration challenges, and the good news will travel back up the line very, very quickly to the executives who need to ensure that data visibility, control and protection are covered.
ALTR is here to help you solve the problem at a very low cost, with a low-friction implementation, and a short time to value. You can start with ALTR’s free plan today and upgrade when you have too many users and too much data to govern at scale without an automated data control and policy enforcement solution like ALTR.
Put your cloud data migration on rails with ALTR!
Oct 7
0
min
CISOs: 100% Responsible for Cloud Data Security with 0% Control
ALTR Blog
Since Salesforce launched at the end of the last century, the cloud application boom has been unstoppable. Along with that has come another boom: cloud-hosted data. The rise of digital transformation, as well as other trends like mobile and IoT, has led to a massive increase in the amount of data created. In fact, 64.2 zettabytes of data were created or replicated in 2020, according to IDC. A zettabyte is 10^21 bytes of data!
Now, all that data represents a rich resource of knowledge to business – from where consumers visit online to how companies make purchases. And the best way to get value from it is to consolidate the multitude of data points and put machine learning, AI or Big Data tools on top of it to connect the dots. This data analysis can either be done in an on-premises data warehouse or in the cloud. Doing it in the cloud delivers some compelling benefits including virtually unlimited scalability with no costs for infrastructure investment and lower ongoing maintenance. The attractiveness of the cloud data warehouse model is one of the reasons Snowflake debuted with the biggest software IPO ever in 2020.
But consolidating all this data, especially sensitive data, into the cloud creates a serious challenge for Chief Information Security Officers (CISOs): how can they be 100% responsible for data security when they have 0% control over the infrastructure where it’s stored?
The cloud data accountability/control mismatch
CISOs and their security teams had their roles nailed down: secure the datacenters with firewalls, stop employees from clicking on phishing emails or accessing malware-infected websites, and protect the company perimeter from hackers and outside threats. These were tactics meant to deliver specific and important end results: keep the network safe and protect company data. But perimeter defense, the very model Forrester Research’s “Zero Trust” framework argues against, simply does not apply to the “perimeter-less” cloud.
But today, a Chief Marketing Officer (CMO) may look at the rich data streams moving throughout the company, generated by 15 or 20 different applications, with hundreds of data points about customers and prospects, and make the argument that if only that data were combined, it could deliver a minutely-detailed composite of individual users and buyers – and marketing could raise revenue by 8%.
The CMO gets the go ahead to move that data to Snowflake, but where does that leave the CISO? Suddenly, the data is in an environment he or she doesn’t control. Increasingly the business project is taking a much higher priority and security is trying to catch up. The CISO is still responsible for securing data that’s been moved outside the nice, cozy, protected perimeter the security team has spent years perfecting. If there’s a data breach, they’re still on the hook, they could still get fired, but how can they stop that if they don’t control the space?
The CISO is still responsible, even when data leaves home
Think of it like a parent who lets their children stay overnight at a friend’s house. The parent is still responsible for the child’s safety, so shouldn’t they ask the friend’s parents some questions? Find out about the culture of the home? Who the parents’ friends are? What kind of rules they impose? The parent doesn’t stop being responsible or stop worrying once their child leaves the home. And they certainly don’t lock their children up at home in order to “keep them safe” – that’s not reasonable.
Some CISOs and Chief Risk Officers try to maintain control by placing stringent rules around how the data can be stored and used in cloud data warehouses. I’m aware of one that requires sensitive data to be stored on Snowflake only when encrypted or tokenized. In order to be used or operated on, it has to be moved into a secure on-prem environment the CISO controls, decrypted/detokenized, utilized, then encrypted or tokenized again before being transferred back to Snowflake.
It may be secure, but it’s like making your child come home to ask permission before playing a game or having a snack at the sleepover. It’s really clunky and slows things down. Some security execs are jumping through a lot of hoops to overcome this accountability and control mismatch.
Others are just abdicating control and trusting cloud data warehouse providers. This leaves a hole in security: these providers have taken over responsibility for maintaining the infrastructure, the perimeter, and the physical space, but they’re not taking on responsibility for user identity and access or for the data itself – that still resides with the company, and especially the CISO. To be clear, Snowflake is very secure, but the more successful it becomes, the bigger a target it is for bad actors, especially nation-states.
Moving beyond perimeter-centric to data-centric security
This shift to the cloud really requires a shift in the security mindset: from perimeter-centric to data-centric security. It means CISOs and security teams need to stop thinking about hardware, datacenters, perimeters and start focusing on the end goal: protecting the data itself. They need to embrace data governance and security policies around data. They need to understand who should have access to the data, understand how data is used, and place controls and protections around data access. They should look for a combined data governance and security solution that delivers complete data control and protection.
Because bad actors don’t care who’s responsible—they’re going where the data is and taking advantage of any holes they find. The 2021 Verizon Data Breach Investigations Report (DBIR) showed this clearly: this year 73 percent of the cybersecurity incidents involved external cloud assets. This is a complete flip-flop from 2019, when cloud assets were only involved in 27 percent of breaches.
Regulators also don’t care where data is when it comes to responsibility for keeping it safe: it’s on the company who collects it. Larger companies in more regulated industries face very large, really punitive fines if there’s a data leak—which can lead to severe consequences for the business…and the CISOs responsible.
If CISOs want to not only catch up to but get ahead of business priorities, bad actors, and regulatory requirements, they need to focus on controlling, protecting and minimizing risk to data—wherever it is.
Mar 26
0
min
Birth of the ALTRverse
ALTR Blog
ALTR’s origin story, like that of a lot of companies, starts with pain. In the early 2010s a group of engineers working at a technology company in the options trading space found themselves contending with problems that had no solutions.
They held data that could very easily and quickly be used for personal financial gain by a thief, and they had no tamperproof records of who was accessing it. They had no way to control that access in real time. And worst of all, they had no feasible way to protect data from those who would log straight into their required co-located servers to steal it – while still keeping the data functional for the business.
The reason why these solutions didn’t exist was not because no one had thought of them. At the time, database access monitoring appliances and encryption devices with sophisticated key management were available.
The problem was that the world had changed.
Like Jeff Bridges’ character Flynn in the classic movie TRON, who has his body digitized and finds himself inside a computer, the world of connected computing had started to detach from its physical roots. Virtualization meant that a computing workload might exist on any device, and the onset of cloud computing and massive proliferation of mobile devices meant that it might not be on a device that was identifiable.
A security model built on a solid physical network topology with endpoints, routers, switches, and servers, was melting away – and with it the first generation of security solutions that were built for that universe.
Taking its place, a logical model. Identities that could be in a café in Turkey or at home in Austin, or hurtling across the sky at hundreds of miles per hour (as the humble author of this post is, right now), accessing workloads and data that also might be anywhere – inside of an “availability zone” instead of on a known server.
This new universe has arrived very quickly, bringing new threats with it, and it has created a crack between itself and the old one that is leaking data. This is because many are trying to fight the new threats with the old tools, looking for the power plug to pull from the wall or the hard drive to wipe when there just isn’t one.
Jeff Bridges didn’t beat the evil Master Control Program by rejecting how his world had suddenly and completely changed. He went with the flow, man, and beat it at its own game.
ALTR’s products are designed for this new world, but this blog isn’t about them. We have a whole website dedicated to that. It’s about exploring the corners of this new universe, highlighting the best ideas from those around the computing and security worlds, and adding our own voice to the conversation. We hope you’ll follow along.
Nov 1
0
min
Get to Your Big Data Analytics Destination
ALTR Blog
Here in Colorado, it’s just about winter sports season. And that means I’m thinking about making the drive up to Summit County to take advantage of some of the best skiing anywhere. The destination is completely worth it, but the road is not without its potential risks: slippery inclines, dramatic switchbacks, snow drifts and 18-wheelers barreling through the weather toward the West Coast.
Businesses today face a similar situation: the ability to use data to gather insights across the enterprise is an exhilarating goal, even though getting there can require overcoming some hazards. Many enterprises are moving company information into cloud data platforms like Snowflake in big data analytics projects that take advantage of scalable storage, accessible compute power, and integrations with cloud-based BI tools for a sophisticated view of every part of the business. To get the full picture though, sensitive data must often be included. Whether that’s personal customer data or highly restricted business information, uploading and utilizing that sensitive data creates a risk due to privacy regulations and confidentiality concerns.
But just like the drive up the mountain, there are technologies that can make it easier and safer for companies to make the journey. Here are three examples where ALTR’s technology can help companies reach their big data analytics goals:
Big Data Analytics Insight #1: Determine the real cost to serve customers
The CFO of a logistics company wants to determine the actual costs to serve their customers in order to better align pricing and improve margins. They pull operational data, inventory management, warehousing, and fuel and vehicle maintenance costs into Snowflake. But this doesn't provide a full picture without including the costs of the people doing the work. The last key piece of data is compensation information for each employee involved in delivering the products. However, unlike the other information, this is highly sensitive information about what each individual employee gets paid along with their banking info. Putting it into Snowflake means that it could be accessible to some employees outside the finance and HR teams, like the Snowflake admin, for example.
ALTR automates and makes the handling of this data easy, and because ALTR sits outside Snowflake, we’re able to create a secure mechanism that delivers an alert every time that sensitive data is accessed – by anyone. The logistics company is able to utilize all the required information – even private payroll data – to accomplish their big data analytics goal.
Big Data Analytics Insight #2: Better model customer buying behavior to boost sales
The CMO of a consumer goods company wants to get a holistic view of its customers and their buying behavior, but data is spread across multiple on-premises and cloud-based systems: Salesforce, Marketo, eCommerce sites, backend ERP systems, and customer behavior analytics tools. In order to tie demographic information about specific buyers to their online activity and buying activity, the data all needs to be in one place with at least one common value, usually a piece of PII (name, email, SS#). With this, marketing teams can look for buying indicators in localized regions: perhaps a mom looked at a specific blog post before purchasing diapers in Austin, TX. Maybe that’s a pattern: several moms looked at that post before buying diapers in Austin. Then marketing can use that insight to promote that blog to other moms in Austin, to drive similar purchases.
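The consolidation step above boils down to joining records from different systems on a shared PII key. A toy sketch (hypothetical field names; a warehouse like Snowflake does this at scale with SQL joins):

```python
# Records from three hypothetical systems, each keyed by the same PII: email.
crm = {"mom@example.com": {"city": "Austin", "segment": "parent"}}
web = {"mom@example.com": {"viewed": ["diaper-guide-post"]}}
purchases = {"mom@example.com": {"bought": ["diapers"]}}

# Merge everything known about each person into one composite profile.
profiles = {
    email: {**crm.get(email, {}), **web.get(email, {}), **purchases.get(email, {})}
    for email in crm
}
# The profile now links demographics, browsing, and buying behavior in one
# place - powerful for marketing, and exactly why the PII key needs governance.
```

The same join key that makes the analysis possible is what privacy regulations protect, which is why access to it has to be controlled.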
ALTR allows companies to protect sensitive PII data easily in Snowflake. You can find and classify personal information, see how it’s being used, then set policies to control access and limit consumption in the event of a policy infringement. Marketing teams can safely (and in compliance with privacy regulations) use sensitive data to do multivariate analysis, create an accurate model of customer behavior, and uncover opportunities to grow sales.
Big Data Analytics Insight #3: Discover ways to optimize specific sales territories or business units
As the value of data analytics has grown, the number of people across the business who have or want access has grown in tandem. It’s no longer a handful of data engineers or analysts who can peer deep into every corner of the business but everyone from marketing to finance to HR to engineering to sales who wants to access operational, sales, marketing or finance data to make better decisions. A sales manager may want to get a better view into her territory – looking at past sales and annual trends or hot industries to find opportunities for the next quarter. All of the data to drive these insights will have to be consolidated into a single repository like Snowflake for cross reference, along with the same data for every other sales territory. In order to make her territory data available to that sales manager, the other territory data must be made unavailable so that it stays confidential.
ALTR enables companies to easily set access policies based on role for any data deemed sensitive or confidential. That means your database admin can ensure each sales manager – or any other role in the company – only has visibility into the information they need to do their jobs better. The ability to easily control access means data can be made more freely available.
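Conceptually, a role-based access policy combines a row-level filter (which territories a role can see) with column-level masking (which fields appear in clear text). A minimal sketch with hypothetical rules and role names, not ALTR’s actual API:

```python
# Each role maps to a territory filter plus the columns it may see unmasked.
POLICY = {
    "sales_manager_west": {"region": "WEST", "clear_columns": {"account", "revenue"}},
    "finance":            {"region": None,   "clear_columns": {"account", "revenue", "cost"}},
}

def apply_policy(role, row):
    rule = POLICY[role]
    # Row-level filter: hide rows outside the role's territory entirely.
    if rule["region"] is not None and row["region"] != rule["region"]:
        return None
    # Column-level masking: redact any column not explicitly allowed.
    return {col: (val if col in rule["clear_columns"] or col == "region" else "***")
            for col, val in row.items()}

east_row = {"region": "EAST", "account": "Acme", "revenue": 100, "cost": 60}
assert apply_policy("sales_manager_west", east_row) is None  # other territory hidden
assert apply_policy("finance", east_row)["cost"] == 60       # finance sees costs
```

Centralizing rules like these, rather than hand-writing them per user, is what makes it practical to open data up more broadly without losing confidentiality.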
This Big Data Analytics Journey is All About the Destination
Think of ALTR as a set of snow tires you put on at the beginning of the season, as you head into sensitive data territory. They’re easy to install and equipped to help you make the journey up the mountain whenever you’re ready to go. And the best part is that once you have them, you can stop thinking about the risks and concentrate on the amazing view.
Sep 29
0
min
Get to “Data-Driven” Faster with Self-Service Data Governance
ALTR Blog
Growing up in Austin, I’ve been steeped in technology from the start – we were talking about tech, music, or BBQ, basically. And one of the things I heard was that “every company is a software company.” The phrase apparently originated in Watts S. Humphrey’s book, Winning with Software: An Executive Strategy, but was repeated in 2015 by Microsoft CEO Satya Nadella. The idea was really simple: use software to spend less money and/or make more money. Companies wanted to find the most cost-effective source materials, track orders more efficiently, or use the web to reach more customers to increase sales.
Today we’re hearing a new mantra: every company is supposed to be a “data-driven company”. But this is a more difficult transition that requires we overcome new regulatory roadblocks. Luckily, we have a winning strategy from the past to guide us.
The low-code/no-code path to becoming a “software company”
There weren’t really any external barriers to becoming “software companies.” Companies just had to hire people who could create software, and they were off and running. The only real limit was knowledge about what software could do and the number of people available who could write the code. Some industries including financial, healthcare and tech were well ahead of this technology curve. But other industries like consumer goods or manufacturing had to catch up. This created a “skills gap” that initially forced companies to hire expensive consultants to come in, evaluate the business, tell them where technology could improve service or increase sales, scope the technology needed, and then deliver it. This might help a company make more money in the long run or increase efficiencies in certain areas, but it certainly wasn’t quick or cheap.
Companies eventually overcame this hurdle with the advent of low-code or no-code tools — basically software that could do the code writing for users. WYSIWYG (what you see is what you get) tools like Wix or Squarespace for websites are one example. Before these self-service tools, company websites were owned and managed by the IT group because they required server setup and Java and HTML code to create and maintain. Once these easy-to-use tools rolled out, employees across the enterprise gained the power that only coders and developers had before. The people who own making that part of the business more modern could do it themselves with these tools, creating an inflection point in technology adoption. The marketing team could now manage their own website, and a typo on the home page no longer had to be a major crisis.
Privacy regulations create a headwind to becoming data-driven
The next generation of this idea is that “every company is a data-driven company” — it’s the latest strategy for companies to make more money or save more money. It makes sense — after companies developed and rolled out all that software, they started ingesting and generating a ton of data about the business and their customers. Companies now have personal information on millions of users, data on how those users use their software, where, when and how. Software also collected digital healthcare data, financial data, purchasing data, logistics data, mobile data, Internet of Things data. Just about everything that exists today – person or object – could have data associated with it that is collected and stored.
While companies are racing to optimize and monetize this amazing resource with technology to uncover insights and share those across the business, they face a new headwind they didn’t in the transition to becoming a software company: data privacy regulations. There were no regulations around writing code, but seemingly every month more and more laws come out directing how PII, PHI, and PCI data can be stored and shared. The E.U.’s GDPR data privacy law went into effect in 2018, California’s CCPA in 2020 and, this year, Virginia and Colorado passed regulations that protect consumer privacy. These regulations come with steep penalties for data leaks or misuse—California’s privacy act fines, for example, can range from $2,500 to $7,500 per record.
No-code, low-code software makes data governance self-service and turns a headwind into a tailwind
Overcoming this headwind means putting something in place to comply with these regulations. Like the industries that started behind in the transition to becoming a software company, many companies today may be early in their journey to data governance: according to TDWI research, just 25% have a data governance program and 44% are only in the design phase. The rest aren’t even thinking about it yet. And they face a similar skills gap. Many of today’s governance technologies are built on legacy infrastructure that not only involves big investments in time, money, and human resources to implement, but also requires expensive developers to set up and maintain. Because, guess what, just like the early days of software, they need people who can code!
The good news is that we already know the solution to that challenge: create no-code/low-code tools that allow non-coders to roll out and manage the data governance solution themselves. This is where ALTR is ahead of the curve. Our cloud-based data access control and security platform requires no code to set up or maintain. Any user can easily automate policy enforcement with a few clicks in the ALTR interface, immediately see how sensitive data is being consumed, and document that access and usage to comply with all relevant privacy regulations. No one needs to know SnowSQL, Apache Ranger, YAML or any other code — the activation of governance policies and controls can be handed off to the data governance teams or any other non-coders to implement and manage. Not only does the process become faster, it becomes less error-prone. Governance teams can see that the policies are working correctly with their own eyes, and they can adjust immediately if there’s something off — just like a marketing team can fix a typo on their website.
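To make the comparison concrete, here is a minimal sketch in Python of the kind of role-based masking logic a governance policy encodes — the logic that no-code tooling configures for you with clicks instead of code. The role names, columns, and masking rule are hypothetical illustrations, not ALTR's implementation:

```python
# Hypothetical sketch of role-based column masking -- the sort of policy
# logic a no-code governance tool lets teams configure without writing code.
# Role names and rules below are illustrative only.

MASKING_RULES = {
    # column -> set of roles allowed to see the clear value
    "ssn": {"compliance_officer"},
    "email": {"compliance_officer", "marketing_analyst"},
}

def apply_policy(row: dict, role: str) -> dict:
    """Return a copy of `row` with unauthorized sensitive columns masked."""
    masked = {}
    for column, value in row.items():
        allowed = MASKING_RULES.get(column)
        if allowed is None or role in allowed:
            masked[column] = value      # non-sensitive column, or authorized role
        else:
            masked[column] = "****"     # mask for unauthorized roles
    return masked

row = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
print(apply_policy(row, "marketing_analyst"))  # ssn masked, email visible
```

Written by hand, rules like these have to be maintained and debugged in code; a no-code interface exposes the same decisions as configuration that a governance team can change and verify directly.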
By delivering data control and protection solutions that anyone can use, we’re allowing data governance to be self-service and enabling companies to utilize sensitive data because anyone can implement the necessary controls and protections.
Becoming Data-Driven Faster
And now with our ALTR Free (forever) plan, we’re truly democratizing data governance: we’re removing all the skill, expense, and resource roadblocks standing in the way, so that everyone can control and protect their data easily, swiftly, and freely. Companies can turn the headwind into a tailwind to more quickly get more value out of data and become the “data-driven companies” they need to be.
Jun 15
ALTR Partners with DataSwitch to Enable Secure Digital Data Transformation
ALTR Blog
The concept of “digital transformation” must be familiar to just about everyone in the business world at this point. Our applications and activities continue their migration from on-premises into the cloud at a rapid rate. With this comes the idea of “digital data transformation”: data shouldn’t just be moved as-is, but re-imagined, remodeled, and modernized to get the most from the cloud data platforms that will host it.
To this end, ALTR has partnered with emerging digital data transformation firm DataSwitch.
DataSwitch's no-code/low-code platform, cloud data expertise, and unique automated schema generation deliver end-to-end data, schema, and ETL process migration with automated re-platforming and refactoring — providing faster time to market and a significant reduction in the cost of migration.
The DataSwitch Migrate toolkit leverages advanced automation to migrate schema, data, and processes from legacy databases like Oracle, Teradata, Netezza, and SQL Server and ETL tools like Informatica, DataStage, and SSIS to modern cloud-based data warehouses like AWS Redshift, Snowflake, and Google BigQuery and integration platforms like Databricks, Spark, SnapLogic, and Matillion. And all this is delivered with built-in technology best practices.
DS Migrate automates data transformation through three key components:
- Schema Redesign: Intuitive, predictive, and self-serviceable schema redesign and restructuring from old-school data models to new-generation data models (including JSON and Graph).
- Data Migration: Enhanced automation, cloud expertise and automated schema generation to accelerate data migration.
- Process Conversion: No-touch code conversion from legacy data scripts and ETL tools to modern database scripts and ETL tools, through an adaptive Translator + Compiler design.
With the full suite of DataSwitch tools, enterprises can scale their decision-making, automate manual processes, and simplify complex analyses, thereby accelerating time to translate data into real-world insights and build their scalable Cloud Data Platforms.
Furthermore, drawing on its experience in regulated industries, DataSwitch, in collaboration with ALTR, is investing in domain-specific solution use cases addressing needs such as business process automation, customer 360, institutional and market risk identification, and fraud detection.
For customers with sensitive data, migration to the cloud can be a little trickier due to privacy regulations. Integrating the DS Migrate toolkit with ALTR helps companies comply with the relevant rules through seamless tokenization of sensitive data as it is migrated, protecting it from risk of theft or leaks. Sensitive datasets of structured, semi-structured, and unstructured data can be tokenized without complex and lengthy software installation. There are no keys to maintain, and no maps to reduce the security of the data. And using ALTR’s cloud platform, tokenized data can be accessed from anywhere you allow. With ALTR, privacy, risk, compliance, data, and security teams work together to govern and automatically control access to sensitive data, simplifying role management, and ensuring data flows to whoever needs it while private information stays protected.
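As a conceptual illustration only, deterministic tokenization applied during a migration might look like the Python sketch below. The HMAC-based token scheme, secret, and column names are assumptions made for demonstration — this is not ALTR's tokenization method, which manages protection as a service without customer-side keys or maps:

```python
import hashlib
import hmac

# Conceptual sketch of deterministic tokenization applied while records are
# migrated. The HMAC approach here is illustrative only -- it is NOT how
# ALTR's tokenization service works.

SECRET = b"demo-secret"  # illustration only; a real service manages secrets for you

def tokenize(value: str) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token.

    Determinism matters during migration: the same input always yields the
    same token, so joins and group-bys on tokenized columns still work.
    """
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()
    return "tok_" + digest[:16]

def migrate_record(record: dict, sensitive_columns: set) -> dict:
    """Tokenize sensitive columns as a record moves to the cloud warehouse."""
    return {
        col: tokenize(val) if col in sensitive_columns else val
        for col, val in record.items()
    }

record = {"customer_id": 42, "ssn": "123-45-6789", "state": "TX"}
print(migrate_record(record, {"ssn"}))  # ssn replaced by a token; other columns untouched
```

The point of the sketch is the shape of the workflow: sensitive values are swapped for tokens in-flight, so the warehouse never stores the clear values, while analytics that don't need the raw data keep working.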
By combining ALTR and DataSwitch, data engineering teams can quickly and easily modernize their data architecture, migrate sensitive data to cloud data platforms safely, and share data with anyone who needs it while ensuring data privacy and security.
To learn more about how DataSwitch and ALTR can help your business, request a demo.
Oct 20
ALTR Attains Snowflake Premier Partner Status
In just seven months since we launched our integration with Snowflake to deliver the market’s first cloud-native platform for observability, governance and protection of sensitive data, we’ve seen tremendous growth. We released innovative product features including automated data usage visualizations, launched the data governance industry’s first and only Free plan available via Snowflake Partner Connect, and seen significant customer momentum. And now we’re pleased to announce that the work we put into developing our Snowflake practice has resulted in ALTR attaining Premier Partner status in the Snowflake Partner Network.
We designed our solution for Snowflake to make it easy for enterprises of all sizes to take full advantage of the Snowflake platform. With our no-code, cloud-based solution, utilizing Snowflake’s native governance and security features, companies can enjoy enterprise-level data control and protection that is easy to implement and easy to manage and maintain, ensuring that valuable sensitive data can be both utilized across the business while staying secure.
Shared customers take notice
Shared customers like innovative functional nutrition and supplement supplier, HumanN, are reaping the rewards. The consumer goods company utilizes Snowflake as a central database to analyze customer data from a multitude of cloud and on-premises systems. Adding ALTR allowed HumanN to include sensitive customer data into its analytics while ensuring the data is safe and in compliance with privacy regulations.
“Everything we do at HumanN is driven by our desire to help people—to push harder, to achieve greater and finish stronger,” said Kocher. “We treat our customers’ data with the same respect we treat the humans behind the data. Our adoption of ALTR is another step in that direction.”
Award-winning consumer activation company, Welltok is also seeing the benefits of being an ALTR/Snowflake customer. “Data security and privacy is of the utmost importance to Welltok and our clients, especially when it comes to health information,” said David MacLeod, CIO and CISO for Welltok. “Working with ALTR helps us ensure safety with real-time monitoring and action.”
Get started with the industry’s leading data control and protection solution for Snowflake. Get ALTR Free!
Apr 13
The Age of Data: Insights From Recent Events and What's Coming Next
Last week, ALTR CTO and co-founder James Beecham had a Zoom call with Angelbeat’s Ron Gerber to chat about some data-related themes like the complexity of data, why data is becoming the new endpoint when you think about security, how PII is becoming the new PCI, and the challenges around using and securing sensitive data in your cloud data platform. Those themes and more are featured in the above video by way of preparing for the upcoming Angelbeat Data event in June (more to come on that soon).
Speaking of events, they’re a pretty big deal for us at ALTR; with all the knowledge sharing and networking, they’re mutually beneficial for vendors and attendees alike. From webinars and virtual events to the long-lost in-person trade shows, we consider ourselves fortunate to be able to work with folks like Ron Gerber of Angelbeat, our customers over at Q2, partners at Snowflake, and all the other experts who share their experiences and insight into the data security space.
In addition to speaking with Ron, James also recently hosted a webinar on “A Security-First Approach to Re-Platforming Data in the Cloud” with Q2’s CAO, Lou Senko, and Snowflake’s Head of Cyber Security, Omer Singer. This webinar not only demonstrates ALTR’s cloud integration with Snowflake, but also provides real use cases from our customer Q2.
On top of the availability of our Snowflake integration (used by Q2, The Zebra, and HumanN to name a few), we are excited to announce our latest integration with OneTrust. OneTrust unifies data governance under one platform, streamlining business processes and ensuring privacy and security practices are built-in to your data governance program.
Together, the integration between OneTrust and ALTR further simplifies data governance by automating the enforcement of governance policy. Now organizations can automatically and continuously discover, classify, and govern access to sensitive data. Sign up here to see a live demonstration of this integration on May 12th.
Along with periodic webinars with industry experts, customers, and partners, we’re also pleased to let you know that we will be participating in Dataversity’s 25th Annual Enterprise Data World event at the end of April as well as RSA’s annual conference in May. As life starts to get closer and closer to normal, we can’t wait to start seeing you all out at in-person conferences later in the year.
To keep up with all the events going on at ALTR, check out our events page, which is always up-to-date with where you can find us. If you are ready to get started with ALTR, you can try it for free or request a demo.
Oct 12
IDG Roundtable: Data, Security and Visibility: How to Minimize Risk in a Time of Rapid Business Change
ALTR teamed up with IDG to host a discussion entitled Data, Security and Visibility: How to Minimize Risk in a Time of Rapid Business Change. The aim was to share best practices and challenges around:
- The ongoing struggle between security needs and innovation goals, and how the pandemic has added to that tension
- How remote work has increased risk and security exposures, specifically related to insider threats and credentialed breaches
- The importance of observability to understand how data is being consumed in order to establish patterns and quickly recognize risky applications and abnormal consumption
- How to distinguish between security at the device, application, and database level
- Re-evaluating priorities and making effective decisions when it comes to security and data protection
With participants from an array of different industries, job functions, and project priorities, it was interesting to learn about their specific goals and challenges, but in the end it was evident that the group had a lot more in common than we anticipated.
Key takeaways:
1. The business owns and understands the data, making it increasingly more challenging for IT to protect the data.
One participant pointed out that whenever someone needs data within the company, they ask IT. While that seems like a logical place to start, it’s usually the business that actually owns and understands the data. While requesters usually end up finding the data (from the business), it is not an efficient use of time and resources. The first step to solving this problem is to find a single platform that bridges the gap, providing observability and logging all consumption. This will allow IT to maintain protection of the data while also being able to curate it in a timely manner.
2. Remote work has all but dissolved the traditional perimeter for any organization or enterprise that still had strong network-based security.
A member of the group shared his story of when the pandemic started and work from home became mandatory. Remote access, which for many organizations might have been a small percentage of their work force, suddenly became the only way that workers were using the organization’s resources. Strong network-focused security postures needed to adjust overnight into more data-centric approaches.
3. Data security is still far too dependent on the infrastructure on which it resides – the cloud has made the problem worse because cloud providers try to provide differentiated toolsets.
This conclusion came out of a discussion around security tools canceling out many of the reasons to leverage the cloud in the first place. Easy to get started, no hardware to install, and the ability to scale quickly are what make the cloud so appealing. So why shouldn’t your security solutions work the same way?
Many newer, more advanced security products are less bound to a specific infrastructure, which means that they can function across hybrid environments and simplify the complex mix of products. This simplification driven by the cloud has become a priority for security leaders.
4. Security clouds are becoming a cost-effective reality with cloud data platforms like Snowflake, but organizations are still overwhelmed with the amount of security data they are collecting.
Cloud data platforms have dramatically improved the speed, efficiency, and flexibility of collecting and analyzing data to power the modern data-driven enterprise. But ease of use and greater access to collected data has presented new challenges in terms of managing data consumption. The modern data ecosystem starts with core applications that create and use massive amounts of data every day. Along the way data is shared, both inbound data from third party sources and outbound data shared with close partners.
By first observing data consumption, you can understand how data is being used, establish patterns, and create baselines. Observation also reveals the high-risk applications you should probably focus on. Once you understand how the data is being consumed, you can begin to actually govern that consumption. Using this approach, you ensure your data is safe while keeping it accessible for the business to do its job.
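The observe-then-govern idea above can be sketched as a simple baseline check. The per-user daily access counts and the 3-sigma threshold below are hypothetical choices for illustration, not a production anomaly detector or ALTR's detection logic:

```python
from statistics import mean, stdev

# Sketch of "observe first, then govern": build a baseline from observed
# daily access counts, then flag days that deviate sharply from it.
# The 3-sigma threshold is an illustrative choice, not a recommendation.

def build_baseline(daily_counts: list) -> tuple:
    """Return (mean, standard deviation) of observed daily access counts."""
    return mean(daily_counts), stdev(daily_counts)

def is_anomalous(count: int, baseline: tuple, sigmas: float = 3.0) -> bool:
    """Flag a day's access count that exceeds the baseline by `sigmas` deviations."""
    mu, sd = baseline
    return count > mu + sigmas * sd

history = [120, 110, 130, 125, 115, 118, 122]  # records accessed per day (observed)
baseline = build_baseline(history)

print(is_anomalous(123, baseline))   # within the normal range -> False
print(is_anomalous(5000, baseline))  # sudden spike -> True
```

Only after a baseline like this exists does it make sense to attach governance actions (alerting, throttling, or blocking) to the deviations it surfaces.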
5. An organization's data security approach must be unique because it’s dependent on the type of data they have and how it needs to be accessed.
One attendee at the roundtable worked at a design firm that deals in very large files that are sensitive because they contain important intellectual property, while another worked for a large insurance company that deals with large structured databases containing PII. Every organization has unique needs and challenges, but they have all been affected by remote work and now have data traversing the Internet far more than before.