Nearly a dozen Senate Democrats wrote to Google this week with questions about how it deletes users' location history when they have visited sensitive locations such as abortion clinics, expressing concerns that the company may not have been consistently deleting the data as promised.

from CNN.com - RSS Channel https://ift.tt/8XhBOcC
Continue Reading

A federal court in Central Islip, New York sentenced William Junior Maxwell II, also known as rapper Fetty Wap, to six years' imprisonment and five years of post-release supervision for conspiracy to distribute cocaine on Wednesday, according to a news release from the US Attorney's Office in the Eastern District of New York.

from CNN.com - RSS Channel https://ift.tt/hcNLro5
Continue Reading

My cofounder and I used to work at Robinhood where we shipped the company’s first OAuth integrations, so we know a lot about how data moves between companies.

For example, we know that the pain of building new API integrations scales with the fragmentation in the industry. In the current meta, we see this pain with a lot of AI startups, who invariably need to connect to their customers' data but have to support 50+ integrations before they even scale to 50+ customers.

This is the process for an AI startup to add a new integration for a customer:

- Pore over the API docs for each source application and write a connector for each
- Play email tag to find the right stakeholders and get them to share sensitive API keys, or give them an OAuth app. It can take 6+ weeks for some platforms to review new OAuth apps
- Normalize data that arrives in different formats from each source (HTML, XML, text dumps, 3 different flavors of markdown, JSON, etc.)
- Figure out what data should be vectorized, what should be stored as SQL, and what should be discarded
- Detect when data has been updated and synchronize it
- Monitor when pipelines break so data doesn't go stale
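
For a sense of scale, here is a minimal Python sketch of the per-source connector that each of those steps ends up producing. The class and method names are hypothetical (this is not Psychic's API); it only illustrates the boilerplate that repeats for every new integration.

    # Hypothetical connector interface -- not Psychic's API, just the shape
    # of the per-integration work described in the list above.
    from abc import ABC, abstractmethod
    from dataclasses import dataclass
    from typing import Iterable

    @dataclass
    class RawRecord:
        source: str        # e.g. "zendesk", "notion", "salesforce"
        payload: str       # HTML, XML, markdown, JSON... varies by source
        updated_at: float  # unix timestamp, used for change detection

    class Connector(ABC):
        """One of these has to be written and maintained per integration."""

        @abstractmethod
        def authenticate(self, credentials: dict) -> None:
            """OAuth dance or API-key exchange with the customer's stakeholders."""

        @abstractmethod
        def fetch(self, since: float) -> Iterable[RawRecord]:
            """Pull records that changed since the last sync."""

        @abstractmethod
        def normalize(self, record: RawRecord) -> dict:
            """Map the source-specific payload onto a common schema, deciding
            what gets vectorized, what goes to SQL, and what is dropped."""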

This is a LOT of work for something that doesn't move the needle on product quality, so it's no wonder that most startups are still relying on file uploads to onboard their early customers. The problem is that file uploads are slow, insecure, and don't scale.

That's why we built Psychic.dev to be the fastest and most secure way for startups to connect to their customers' data. You integrate once with our universal APIs and get N integrations with CRMs, knowledge bases, ticketing systems, and more with no incremental engineering effort.

We abstract away the quirks of each data source into Document and Conversation data models, and try to find a good balance to allow deep integrations while preserving general purpose utility. Since it’s open source, we encourage founders to fork and extend our data models to fit their needs as they evolve, even if it means migrating off our paid version.
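
As a rough sketch of what such shared models could look like in Python (the field names here are assumptions, not Psychic's actual schema; the open-source repo is the ground truth):

    # Rough sketch of normalized Document / Conversation models in the spirit
    # of the abstraction described above. Field names are assumptions, not
    # Psychic's actual schema.
    from dataclasses import dataclass, field

    @dataclass
    class Document:
        id: str
        title: str
        content: str     # normalized plain text or markdown
        uri: str         # link back to the source system
        connector: str   # e.g. "notion", "zendesk", "confluence"
        metadata: dict = field(default_factory=dict)

    @dataclass
    class Message:
        sender: str
        content: str
        timestamp: str

    @dataclass
    class Conversation:
        id: str
        channel: str     # e.g. "slack", "intercom"
        messages: list[Message] = field(default_factory=list)

The idea is that a fork can add fields without breaking consumers that only read the core ones.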

To see an example in action, check out our demo repo here: https://github.com/psychic-api/psychic-langchain-tutorial/

We are also open source and open to contributions; learn more at docs.psychic.dev or email us at founders@psychic.dev!


Comments URL: https://news.ycombinator.com/item?id=36032081

Points: 10

# Comments: 0



from Hacker News: Front Page https://ift.tt/1JEzIOr
Continue Reading

Here is a screenshot of my Bitwarden: https://ift.tt/qcdoY4N

They include some really important things such as:

- Health insurance
- G-Suite for work
- Bill.com (which I use to get paid)
- IRS.gov (which I use to get un-paid)
- UK Companies House Register
- Interactive Brokers
- My bank

Obviously, anything with OAuth is "bundled" into my Google account. So if anything this is a huge underestimate.

I'm asking because of how insane auth has become. I know companies like 1Password and Bitwarden are working on this, and overall they do a great job. But I still have a near-stroke every time I have to do the "forgot my password" loop, or use Duo Mobile or some other 2FA app.

The only really good auth features I've ever encountered have been Apple's "fill from Messages" feature and Touch ID.


Comments URL: https://news.ycombinator.com/item?id=36020008

Points: 43

# Comments: 73



from Hacker News: Front Page https://ift.tt/w240lvk
Continue Reading

The Internal Revenue Service is weighing whether to build its own free tax filing system and plans to launch a limited pilot program that will be available to some taxpayers next year during the 2024 tax filing season.

from CNN.com - RSS Channel https://ift.tt/93f1thH
Continue Reading

Hey HN, my name is Vikas, and my cofounders Rish, Gabe and I are building Openlayer: http://openlayer.com/

Openlayer is an ML testing, evaluation, and observability platform designed to help teams pinpoint and resolve issues in their models.

We were ML engineers experiencing the struggle that goes into properly evaluating models, making them robust to the myriad of unexpected edge cases they encounter in production, and understanding the reasons behind their mistakes. It was like playing an endless game of whack-a-mole with Jupyter notebooks and CSV files — fix one issue and another pops up. This shouldn’t be the case. Error analysis is vital to establishing guardrails for AI and ensuring fairness across model predictions.

Traditional software testing platforms are designed for deterministic systems, where a given input produces an expected output. Since ML models are probabilistic, testing them reliably has been a challenge. What sets Openlayer apart from other companies in the space is our end-to-end approach to tackling both pre- and post-deployment stages of the ML pipeline. This "shift-left" approach emphasizes the importance of thorough validation before you ship, rather than relying solely on monitoring after you deploy. Having a strong evaluation process pre-ship means fewer bugs for your users, shorter and more efficient dev-cycles, and lower chances of getting into a PR disaster or having to recall a model.

Openlayer provides ML teams and individuals with a suite of powerful tools to understand models and data beyond your typical metrics. The platform offers insights about the quality of your training and validation sets, the performance of your model across subpopulations of your data, and much more. Each of these insights can be turned into a "goal." As you commit new versions of your models and data, you can see how your model progresses toward these goals, guarding against regressions you might otherwise have missed and continually raising the bar.

Here's a quick rundown of the Openlayer workflow:

1. Add a hook in your training / data ingestion pipeline to upload your data and model predictions to Openlayer via our API

2. Explore insights about your models and data and create goals around them [1]

3. Diagnose issues with the help of our platform, using powerful tools like explainability (e.g. SHAP values) to get actionable recommendations on how to improve

4. Track the progress over time towards your goals with our UI and API and create new ones to keep improving
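
To make step 3 concrete, here is a small, self-contained sketch of the kind of SHAP-based analysis involved. It uses the open-source shap and xgboost packages directly on a toy dataset; it is not Openlayer's API.

    # Generic SHAP-based error analysis on a toy model, standing in for the
    # explainability tooling referenced in step 3. Not Openlayer's API.
    import shap
    import xgboost
    from sklearn.model_selection import train_test_split

    # Toy data and model standing in for "your model and validation set".
    X, y = shap.datasets.adult()
    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
    model = xgboost.XGBClassifier().fit(X_train, y_train)

    # Explain a sample of validation predictions.
    explainer = shap.Explainer(model, X_train)
    shap_values = explainer(X_val.iloc[:200])

    # Rank features by global importance to see what drives the model's mistakes.
    shap.plots.bar(shap_values)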

We've got a free sandbox for you to try out the platform today! You can sign up here: https://app.openlayer.com/. We're also adding support for more ML tasks soon, so please reach out if your use case isn't supported yet and we can add you to a waitlist.

Give Openlayer a spin and join us in revolutionizing ML development for greater efficiency and success. Let us know what you think, or if you have any questions about Openlayer or model evaluation in general.

[1] A quick run-down of the categories of goals you can track:

- Integrity goals measure the quality of your validation and training sets

- Consistency goals guard against drift between your datasets

- Performance goals evaluate your model's performance across subpopulations of the data

- Robustness goals stress-test your model using synthetic data to uncover edge cases

- Fairness goals help you understand biases in your model on sensitive populations
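
To make the "performance across subpopulations" idea concrete, here is a generic sketch of that kind of check in plain pandas and scikit-learn. The column names ("segment", "label", "prediction") are invented for illustration; this is not Openlayer's API.

    # A "performance goal" in miniature: per-subpopulation accuracy with a
    # threshold, so a regression in any segment fails the check.
    import pandas as pd
    from sklearn.metrics import accuracy_score

    val = pd.DataFrame({
        "segment":    ["mobile", "mobile", "web", "web", "web", "api"],
        "label":      [1, 0, 1, 1, 0, 1],
        "prediction": [1, 0, 0, 1, 0, 0],
    })

    per_segment = val.groupby("segment").apply(
        lambda g: accuracy_score(g["label"], g["prediction"])
    )
    print(per_segment)

    THRESHOLD = 0.8
    failing = per_segment[per_segment < THRESHOLD]
    if not failing.empty:
        print(f"Goal failed for segments: {sorted(failing.index)}")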


Comments URL: https://news.ycombinator.com/item?id=35951703

Points: 15

# Comments: 5



from Hacker News: Front Page https://ift.tt/AiuzOs6
Continue Reading

Jamie Komoroski's blood alcohol level was over three times the legal limit when she allegedly drove her car into a golf-cart style vehicle carrying a newly married couple away from their wedding reception, killing the bride, a toxicology report shows.

from CNN.com - RSS Channel https://ift.tt/yk0fw6W
Continue Reading

View CNN's Fast Facts and learn more about the life of Broadway Joe, Pro Football Hall of Fame quarterback Joe Namath.

from CNN.com - RSS Channel https://ift.tt/MbBuEjn
Continue Reading

Gabby Petito's parents are continuing to ask a Florida judge to order Brian Laundrie's mother to turn over an undated letter she wrote to her son that was in his backpack when his body was found, alleging it references "baking a cake with a shiv in it should Brian Laundrie go to prison."

from CNN.com - RSS Channel https://ift.tt/egw4rHs
Continue Reading

This is a project I've been working on for a little while and I'm interested in your feedback and point of view.

The Domain Verification protocol stores a DNS TXT record at a DNS name derived from a hashed "verifiable identifier" (email, telephone, DID), enabling anyone who can prove control over the verifiable identifier to prove authority for the domain name, whilst preserving the privacy of the authorised party.

Once set up, the record enables automatic domain verification for any service provider.

This record could be automatically set up by domain registrars upon domain registration (with registrant opt-in), creating a fast lane for verification with the service providers many new small businesses use (e.g. Google Ads, Facebook, Office365, Dropbox, etc).

=====

Many of us would have verified a domain name by pasting a string into a DNS TXT record. These methods are currently being discussed and standardised at the IETF [2].

Let's Encrypt's DNS-01 method [3] is probably considered the state of the art. The differences between DNS-01 and the Domain Verification protocol are:

- DNS-01 requires a new TXT record for each service provider. With the Domain Verification protocol, multiple service providers can use the same record.

- Instructions to set up a DNS-01 TXT record are initiated by the service provider, whereas a Domain Verification protocol record can be set up independently by a user or a domain registrar. They could even be pre-populated by a registrar upon domain registration (with registrant opt-in).

- There's no concept of permissions in DNS-01; the act of creating the record gives the user full access for the domain with the service provider. With the Domain Verification protocol, multiple records can be set up and limited permissions can be granted to different third parties. For example, give a marketing agency the authority to claim the domain on social media but nowhere else.

I'm still working on licensing, but creating these records will always be free. I hope to find service providers who see significant upside in reducing friction for user onboarding and are willing to pay to license it.

Worked example: Let's say you want to authorise the user with the email user@example.com for the domain dvexample.com. These are the steps:

a. HASH(user@example.com) -> 4i7ozur385y5nsqoo0mg0mxv6t9333s2rarxrtvlpag1gsk8pg

b. Store Domain Verification record at: 4i7ozur385y5nsqoo0mg0mxv6t9333s2rarxrtvlpag1gsk8pg._dv.dvexample.com

c. TXT record determines permissions and time limit:

@dv=1;d=Example user email;e=2025-01-01;s=[seo;email];h=4i7ozur385y5nsqoo0mg0mxv6t9333s2rarxrtvlpag1gsk8pg
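
Here is a rough Python sketch of steps (a)-(c), plus the lookup a service provider could do. The exact hash and encoding aren't specified in this post, so SHA-256 rendered as base36 is an assumption (chosen because it yields a ~50-character lowercase label like the one above), and the TXT check is simplified.

    # Sketch of the worked example: hash the identifier, derive the record
    # name, and let a service provider look it up. SHA-256 + base36 is an
    # assumption; the protocol draft is authoritative.
    import hashlib

    import dns.resolver  # pip install dnspython

    def identifier_label(identifier: str) -> str:
        """Step (a): hash a verifiable identifier (email, phone, DID) into a DNS label."""
        digest = hashlib.sha256(identifier.strip().lower().encode()).digest()
        n = int.from_bytes(digest, "big")
        alphabet = "0123456789abcdefghijklmnopqrstuvwxyz"
        label = ""
        while n:
            n, r = divmod(n, 36)
            label = alphabet[r] + label
        return label or "0"

    def record_name(identifier: str, domain: str) -> str:
        """Step (b): where the Domain Verification TXT record lives."""
        return f"{identifier_label(identifier)}._dv.{domain}"

    def verify(identifier: str, domain: str) -> bool:
        """Service-provider side: fetch the TXT record and check it exists.
        A real check would also validate the e= expiry, s= scopes and h= hash
        from step (c) before granting any permissions."""
        name = record_name(identifier, domain)
        try:
            answers = dns.resolver.resolve(name, "TXT")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return False
        return any(
            b"".join(rdata.strings).decode().startswith("@dv=1")
            for rdata in answers
        )

    print(record_name("user@example.com", "dvexample.com"))
    print(verify("user@example.com", "dvexample.com"))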

Thanks for taking a look,

Elliott

1. https://news.ycombinator.com/item?id=35827952

2. https://datatracker.ietf.org/doc/draft-ietf-dnsop-domain-ver...

3. https://letsencrypt.org/docs/challenge-types/

=====

Quick sidebar:

This was originally submitted to HN under the title "Show HN: Make domain verification as easy as verifying an email or phone number" 3 days ago [1]. It was doing really well (#3 on the front page), then totally disappeared from the front page and dropped to the bottom of page 1 of Show HN.

In an email exchange, dang (incredibly helpful as always) explained that it got flagged by the "overheated discussion detector", and it turned out I caused this by diligently responding to every comment as fast as my fingers would type because I wanted to keep engagement going. Helpfully, dang took the flag off about 12 hours later after our exchange, but understandably the momentum was lost.

So I feel like it kinda got killed, just as it was picking up pace and as the US west coast was waking up. So I am humbly reposting it with a modified description based on the comments of the last post.


Comments URL: https://news.ycombinator.com/item?id=35864114

Points: 15

# Comments: 3



from Hacker News: Front Page https://ift.tt/R4ZbaXj
Continue Reading

North Carolina Lt. Gov. Mark Robinson, the current Republican favorite to be the party's nominee for governor in 2024, has a long history of remarks viciously mocking and attacking teenage survivors of the 2018 shooting at Marjory Stoneman Douglas High School in Parkland, Florida, for their advocacy for gun control measures.

from CNN.com - RSS Channel https://ift.tt/j5qRpL4
Continue Reading

The board overseeing Disney's special taxing district -- which was appointed by and is aligned with Florida Gov. Ron DeSantis -- voted on Monday to sue the company days after the entertainment giant filed its own lawsuit against the board.

from CNN.com - RSS Channel https://ift.tt/2mMJPeX
Continue Reading