Data and technology have become more important to the cannabis market as it matures, and insurance professionals specializing in the segment would benefit from keeping that top of mind.
A group of experts gathered during Insurance Journal’s Insuring Cannabis Summit to discuss emerging opportunities and notable insurance focus areas. The panel was hosted by Charles Pyfrom, chief marketing officer at CannGen Insurance Services. Panelists shared their views on data implementation and the importance of maintaining close contact with clients throughout policy lifecycles.
Despite disappointing cannabis sales in long-legal states in the West, cannabis data remains a mixed bag, with rapid growth coming from newer states in the Midwest and the East. Sales figures also point to long-term growth for the industry as a whole.
Legal cannabis industry retail sales will grow to more than $33.5 billion in 2023, a recent Marijuana Business Daily forecast showed.
New and expanding markets are driving rapid growth. And as the industry matures, insuring cannabis specialists are leaning on accumulated data to align policies with business projections and loss histories.
“What I’m seeing is that there are insureds that are underinsured,” said Curtis Prince, CEO and founder of Flux Insurance Services. “So, there’s still a lot of insurance to be sold.”
Many cannabis operators are first-time business owners who may not fully understand what comes with being an employer. They also may not understand proven risk management, nor the exposures that endanger their livelihoods.
In addition to securing required general liability coverage, Prince encouraged retail agents to engage in workers’ compensation conversations “right out the gate.” Employees have the potential to be a cannabis business’ biggest liability, so making workers’ comp a focus “also helps you get more immersed with the business owner and what the business owner has in terms of pain,” he said.
Jesse Parenti, founder of JP Squared Consulting, believes the next best area of focus should be auto coverage, which “is the one thing you have in your portfolio that can create an extinguishing event.”
Telematics can help cannabis operators manage, monitor and track their product-moving fleets, Parenti said.
Prince called telematics “a great HR tool” that allows business owners to be proactive in disciplining unsafe drivers. He said it also provides in-depth data regarding claims that is “far more valuable than just traditional loss history because they’re able to verify this is exactly what happened with that vehicle during that event.”
He later called telematics “the future of cannabis distribution.”
Because so many cannabis businesses are new, panelists highlighted the value of transparency and of carefully explaining how the insurance model works. They stressed the importance of keeping in close contact with policyholders rather than popping up only at renewal time.
While sharing coverage information early is important, depending on the circumstances, clients in the legal cannabis space may not need every coverage on day one.
“You have the best ability with an emerging market to actually cross-sell right when you bind the coverage,” Prince said, “and then put a service plan in place to be able to engage with them on an ongoing basis and help them be proactive.”
Data and Bottom Lines
Data influences carrier risk appetites, speeds up underwriting processes and can ultimately lead to suitable coverages that align with accurate forecasting. Setting realistic targets is key when securing coverage for prospective policyholders. In short, better data leads to better outcomes, the panelists agreed.
“I’m really big on correct forecasting,” Parenti said. “I think the biggest thing is people’s ego, and over-projecting bites them really hard. And they don’t really know the ramifications until they feel that pain.”
Flux’s platform has empowered agents to better assist clients in forecasting the insurance costs of new ventures and avoid hits to their bottom lines, according to Prince.
Telematics, GPS and dash cam recordings saved one of Parenti’s clients from being hit with a $12 million lawsuit that would have ended the business.
“Data and technology is really where this industry is going,” Parenti said. “One of the benefits of cannabis is it embraces technology where most industries don’t. So, we have that in our favor. The challenge is that the commercial insurance world really lags when you have that gap between technology and what’s needed.”
Max Meade, insurance advisor at Brown & Brown Insurance, uses data to tell clients how they should build insurance costs into P&L predictions.
“So, I can show them, again from an industry average, what’s happening,” Meade said. “The types of claims, the cost, so it’s more of a visualization for them. That gives them a better understanding of, ‘Okay, this insurance responds to this, this does this.’”
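As a minimal sketch of what building insurance costs into a P&L prediction can look like, here is a short Python example. All of the figures are hypothetical placeholders, not the industry-average claim data Meade described:
projected_revenue = 2_000_000        # hypothetical annual revenue projection
premium_rate = 0.035                 # hypothetical premium as a share of revenue
avg_claims_per_year = 1.2            # hypothetical industry-average claim frequency
avg_cost_per_claim = 15_000          # hypothetical industry-average claim severity
expected_premium = projected_revenue * premium_rate
expected_retained_losses = avg_claims_per_year * avg_cost_per_claim  # e.g., deductibles
# Fold insurance into the P&L forecast as explicit line items
pnl = {
    "Revenue": projected_revenue,
    "Insurance premium": -expected_premium,
    "Expected retained losses": -expected_retained_losses,
}
print(pnl)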
Pyfrom added: “The common theme that I hope everyone heard here is data is key.”
Go Deeper
The full webinar includes in-depth information on delta-8 and CBD, notable auto insurance exposures and the cultural shifts propelling the legal cannabis industry’s growth. Access it or get more information on the summit’s website.
IBM sees a confluence of generative artificial intelligence and APIs, with AI powering APIs in a way that improves the productivity of API teams.
AI is augmenting skills that API teams may just be starting to learn, said Rashmi Kaushik, director of product management for the integration portfolio at IBM, during a presentation at the API World conference in Santa Clara, California, on November 6. “It’s able to help them complete their API projects faster.” Also, APIs are powering AI, she added. APIs empowering AI and the rise of AI assistance are truly beneficial to API teams, Kaushik said.
Companies such as IBM have released API testing capabilities built on traditional AI. But AI is not magic. It has been a technology many years in the making, and it is here to transform the way business is done, Kaushik said. Regardless of how much AI is leveraged, users want to make sure that it is safe, responsible, and ethical, she said.
IBM offers the API Assistant for IBM API Connect, powered by the watsonx.ai integrated AI platform. It uses generative AI to help API teams accelerate API life-cycle activities for a quicker time to market, the company said. IBM API Assistant automates tasks, enabling teams to focus on higher-value work and innovation, according to IBM. API assistants are able to augment API teams, so they progress faster, Kaushik said.
Both proposals warn of the threat posed to information security by advances in quantum computing. A future large-scale quantum computer could use Shor’s algorithm to compromise the security of widely deployed public-key algorithms in a matter of hours. Such algorithms are used by the Java platform for activities such as digitally signing JAR (Java archive) files and establishing secure network connections. Cryptographers have responded to this threat by inventing quantum-resistant algorithms that cannot be defeated by Shor’s algorithm. Switching to quantum-resistant algorithms is urgent, even though large-scale quantum computers do not yet exist.
Each of the two proposals is eyed for the Standard Edition of Java, but neither is targeted for a specific version at this point. Both proposals were created August 26 and updated November 6.
Despite these issues, the hype train was at full speed. For example, a large provider took issue with me pointing out some of the shortcomings of this “new” serverless technology. Instead of addressing the problems, they called for my immediate firing due to my blasphemous comments. I hit a nerve. Why was that? The cloud providers promoting serverless should have had more confidence in their technology. They knew the challenges. I was right about serverless then, and right when I wrote about its decline. However, I’m always willing to reevaluate my position as technology evolves. I believe in redemption.
A technological comeback
Despite its early hurdles, serverless computing has bounced back, driven by a confluence of evolving developer needs and technological advancements. Major cloud providers such as AWS, Microsoft Azure, and Google Cloud have poured substantial resources into serverless technologies to provide enhancements that address earlier criticisms.
For instance, improvements in debugging tools, better handling of cold starts, and new monitoring capabilities are now part of the serverless ecosystem. Additionally, integrating artificial intelligence and machine learning promises to expand the possibilities of serverless applications, making them seem more innovative and responsive.
Java application security would be enhanced through a couple of proposals to resist quantum computing attacks, one plan involving digital signatures and the other key encapsulation.
The two proposals reside in the OpenJDK JEP (JDK Enhancement Proposal) index. One proposal, titled “Quantum-Resistant Module-Lattice-Based Digital Signature Algorithm,” calls for enhancing the security of Java applications by providing an implementation of the quantum-resistant Module-Latticed-Based Digital Signature Algorithm (ML-DSA). Digital signatures are used to detect unauthorized modifications to data and to authenticate the identity of signatories. ML-DSA is designed to be secure against future quantum computing attacks. It has been standardized by the United States National Institute of Standards and Technology (NIST) in FIPS 204.
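The JEP would surface ML-DSA through Java’s standard security APIs. For readers who want to experiment with the algorithm today, the sign-and-verify flow looks like the following Python sketch using the open source liboqs-python bindings; the oqs module and the “ML-DSA-65” parameter-set name are assumptions about that library’s naming, not part of the Java proposal:
import oqs  # liboqs-python bindings; assumes liboqs is built with ML-DSA enabled
message = b"contents of a JAR file to be signed"
# Signer generates a key pair and signs the message
with oqs.Signature("ML-DSA-65") as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)
    # Verifier checks the signature against the signer's public key
    with oqs.Signature("ML-DSA-65") as verifier:
        assert verifier.verify(message, signature, public_key)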
The other proposal, “Quantum-Resistant Module-Lattice-Based Key Encapsulation Mechanism,” calls for enhancing application security by providing an implementation of the quantum-resistant Module-Lattice-Based Key Encapsulation Mechanism (ML-KEM). KEMs are used to secure symmetric keys over insecure communication channels using public key cryptography. ML-KEM is designed to be secure against future quantum computing attacks and has been standardized by NIST in FIPS 203.
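The KEM flow itself is simple: one party encapsulates a shared secret under the other’s public key, and the other decapsulates it. Here is a sketch of that exchange, again using liboqs-python rather than the proposed Java API; the oqs module and the “ML-KEM-768” name are assumptions about that library:
import oqs  # liboqs-python bindings; assumes liboqs is built with ML-KEM enabled
with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()
    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        # Sender derives a shared secret plus a ciphertext to transmit
        ciphertext, sender_secret = sender.encap_secret(public_key)
    # Receiver recovers the same shared secret from the ciphertext
    receiver_secret = receiver.decap_secret(ciphertext)
assert sender_secret == receiver_secret  # both sides now hold the symmetric key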
You start with an existing project and the details of build tools and frameworks, along with a target Java version (for example, upgrading from Java 8 to Java 21). The Copilot upgrade assistant analyzes your code base and generates a list of the steps necessary to run your upgrade, presenting it as a set of GitHub issues that you can check before running the update.
Once you’re happy with the tasks, the tool takes you to a dashboard where you can watch the update process, including how Copilot rewrites code for you. You can stop and start the process at any time, drilling down into tasks for more information on just how the AI-based code is working. It’s good to have this level of transparency, as you need to be able to trust the AI, especially when it’s working on business-critical software.
As this is an agentic AI process, the service can detect errors and fix them, launching sub-agents that make changes, rebuild, and retest code. Interestingly, if a fix doesn’t work, it’ll take another approach, drawing on the shared knowledge of the Java developers whose work was used to train the Copilot Java model. Like other GitHub Copilots, changes that work are used to fine-tune the model, reducing the risk of errors in future runs. That goes for manual updates and changes too.
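GitHub hasn’t published the internals, but the control flow described above (attempt a fix, rebuild, retest, switch strategy on failure) amounts to a loop like the following Python sketch, where propose_fix, apply, and build_and_test are hypothetical stand-ins for the Copilot sub-agents:
def upgrade_step(task, propose_fix, apply, build_and_test, max_attempts=3):
    """Hypothetical sketch of an agentic fix loop; not GitHub's actual code."""
    for attempt in range(max_attempts):
        fix = propose_fix(task, attempt)   # each retry may take a new approach
        apply(fix)
        ok, errors = build_and_test()
        if ok:
            return fix                     # successful fixes can feed fine-tuning
        task = (task, errors)              # feed failures back to the next attempt
    raise RuntimeError("escalate to a human developer")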
In other words, there is no single address, IP, or server to block. That said, there are downsides to the technique that Checkmarx does not mention, including the fact that blockchain communication is slow as well as public. Blockchains can’t easily be edited or blocked, but they can be tracked once their use as part of malware C2 has been uncovered.
This is probably why, despite past predictions that the technique would take off, using blockchains for C2 remains the experimental preserve of specialist malware.
Package confusion
Perhaps the more significant part of the story is that the technique is being used to target testing tools distributed via NPM, the largest open source JavaScript registry. Targeting testing tools is another way to get inside privileged developer testing environments, and to gain deeper access to any CI/CD pipelines they reveal.
Microsoft has introduced its Microsoft.Extensions.VectorData.Abstractions library, now in preview. The library provides abstractions to help integrate vector stores into .NET applications and libraries.
The vector data abstractions library, introduced October 29, provides library authors and developers with the ability to perform create-read-update-delete (CRUD) operations and use vector and text search on vector stores.
Vector databases are important for search tasks and for grounding AI responses, Microsoft said. These databases are built to store, index, and manage data represented as embedding vectors. As a result, the indexing algorithms used by vector databases are optimized to retrieve data that can be used downstream in applications.
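The library itself is .NET, but the underlying idea is easy to see in a language-neutral way. Here is a minimal Python sketch of what a vector store does: CRUD operations over records that carry embedding vectors, plus a similarity search. It uses only numpy, and the three-dimensional embeddings are hypothetical (real ones come from an embedding model):
import numpy as np
store = {}  # record ID -> (text, embedding vector)
def upsert(key, text, vector):
    # Create or update a record (the C and U of CRUD)
    store[key] = (text, np.asarray(vector, dtype=float))
def delete(key):
    # Remove a record (the D of CRUD)
    store.pop(key, None)
def search(query_vector, top_k=2):
    # Rank records by cosine similarity to the query embedding
    q = np.asarray(query_vector, dtype=float)
    def cosine(v):
        return float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
    scored = [(key, cosine(vec)) for key, (_, vec) in store.items()]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)[:top_k]
upsert("doc1", "returns policy", [0.9, 0.1, 0.0])
upsert("doc2", "shipping times", [0.1, 0.9, 0.2])
print(search([0.8, 0.2, 0.1]))  # doc1 should rank first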
import pandas as pd

# Build a dataframe from a dictionary of equal-length columns
data = {
    "Title": ["Blade Runner", "2001: A Space Odyssey", "Alien"],
    "Year": [1982, 1968, 1979],
    "MPA Rating": ["R", "G", "R"]
}
df = pd.DataFrame(data)
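Once built, the dataframe can be queried by row and by column; for example:
print(df[df["Year"] < 1980])    # filter rows: films released before 1980
print(df["Title"].str.upper())  # vectorized operation over a single column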
Applications that use dataframes
As I previously mentioned, almost every data science library or framework supports a dataframe-like structure of some kind. The R language is generally credited with popularizing the dataframe concept (although it existed in other forms before then). Spark, one of the first broadly popular platforms for processing data at scale, has its own dataframe system. The Pandas data library for Python, and its speed-optimized cousin Polars, both offer dataframes. And the analytics database DuckDB combines the conveniences of dataframes with the power of a full-blown database system.
It’s worth noting the application in question may support dataframe data formats specific to that application. For instance, Pandas provides data types for sparse data structures in a dataframe. By contrast, Spark does not have an explicit sparse data type, so any sparse-format data needs an additional conversion step to be used in a Spark dataframe.
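For example, pandas can convert a mostly-zero column into a memory-efficient sparse representation, something Spark’s dataframes have no direct equivalent for. A minimal sketch:
import numpy as np
import pandas as pd
# A mostly-zero column stored densely, then converted to a sparse dtype
dense = pd.DataFrame({"values": np.zeros(1_000)})
dense.loc[3, "values"] = 42.0
sparse = dense.astype(pd.SparseDtype("float", fill_value=0.0))
print(sparse.sparse.density)  # fraction of values that differ from the fill value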
To that end, while some libraries with dataframes are more popular, there’s no one definitive version of a dataframe. They’re a concept implemented by many different applications. Each implementation of a dataframe is free to do things differently under the hood, and some dataframe implementations vary in the end-user details, too.
Java’s internal systems and syntax are constantly evolving, and these changes happen primarily through the Java Community Process (JCP) and JDK Enhancement Proposals (JEPs). Together, the JCP and JEPs define the path by which new features are described, designed, and, hopefully, introduced into the JVM. They keep the Java language and platform dynamic and the community engaged. With JDK 24 so close to its planned release date, now is a good time to take a look at the upcoming JEPs making their way through the process.
Stages of the JEP process
You can check the JEP Index on the OpenJDK homepage for a catalog of all the JEPs, past and present, submitted for Java. The sheer size and scope of the index can be overwhelming at first. It’s not immediately clear what each JEP is about or which ones are more significant. You might expect major projects like virtual threads to be distinguished from proposals of smaller scope, but there is no such distinction. Every JEP is listed, showing the complete history of Java’s evolution.
Instead of scope, the JEPs are organized according to their stage of development: